Time for a skeleton dance. Synapse, seen here doing user skeleton tracking on the Mac.

While we wait for Microsoft to send the new Kinect – yes, we’re on the list for one here at CDM HQ – there’s still plenty to be done with the current generation of Kinect. And it’s likely that you’ll find even more of these on the cheap when there’s new hardware out there.

The problem is, apart from using Microsoft’s prescribed development tools on Windows, working with Kinect can be a bit tricky. What if you want to plug in a Kinect and play around quickly to try some possibilities? Or what if you want to work with collaborators or students, and don’t want to saddle them with massive, potentially-problematic installation requirements? (MacPorts, I’m looking at you. Mac’s not Linux, is it?)

Here are some of the easier solutions. Readers, it’d be great to share feedback; let’s talk. Our hive mind is, I’m sure, much smarter than my individual brain. (Purposely left off this list: solutions that work on only one OS, or require more expensive licensing for commercial use. My goal here was things that would lead to easy collaboration or teaching. For more specific solutions, of course, you might well decide an investment in one of those, or in something OS-specific, makes sense.)

With these tools, you get either a driver-free setup or a straightforward driver installation, plus OSC (OpenSoundControl) messages that can be read by any visual software (or, with a little more effort, any music software).
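
Under the hood, OSC messages are just UDP packets with a simple layout: a null-padded address string, a comma-prefixed type-tag string, then big-endian arguments. As a rough sketch of what these apps are emitting – the /joint address and float arguments below are illustrative, not any particular app’s actual schema – a minimal decoder in plain Python looks like this:

```python
import struct

def read_padded_string(data, offset):
    """Read a null-terminated OSC string, padded to a multiple of 4 bytes."""
    end = data.index(b"\x00", offset)
    s = data[offset:end].decode("ascii")
    offset = end + 1
    offset += (-offset) % 4  # skip the padding NULs
    return s, offset

def decode_osc(data):
    """Decode a simple OSC message with int32 ('i') and float32 ('f') args."""
    address, offset = read_padded_string(data, 0)
    typetags, offset = read_padded_string(data, offset)
    args = []
    for tag in typetags.lstrip(","):
        if tag == "f":
            args.append(struct.unpack_from(">f", data, offset)[0])
            offset += 4
        elif tag == "i":
            args.append(struct.unpack_from(">i", data, offset)[0])
            offset += 4
    return address, args
```

In practice you’d let an OSC library do this for you; the point is just that the wire format is trivial to consume from any environment that can open a UDP socket.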

Synapse by Ryan Challinor is a ridiculously simple, ridiculously effective solution. It simply bundles together the data you most need – joint information from the skeleton tracking – and fires it off as OSC messages. Better still, there are already examples for Quartz Composer, Ableton Live, and Max. So you can begin playing with it right away. (I’m working on one for Pd; stay tuned.) Little wonder it works well: Ryan is at Harmonix and built the gesture menu for Dance Central. Bless you, yet another geek willing to keep geeking out after work hours end in the name of digital art.
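
Synapse documents its exact OSC addresses itself, so I won’t reproduce them from memory here; but whatever the joint, the first thing you end up writing is a scaler from a body-relative position in millimeters to a normalized control value for Live, Quartz Composer, or whatever you’re driving. A sketch in Python – the ±600 mm range is an assumption you’d tune to your space and performer:

```python
def scale_joint(value_mm, lo=-600.0, hi=600.0):
    """Map a body-relative joint coordinate (mm) to a clamped 0..1 control value.

    The lo/hi range is a guess at a comfortable reach; tune it per installation.
    """
    t = (value_mm - lo) / (hi - lo)
    return max(0.0, min(1.0, t))
```

Feed that a hand position from any of these tools and you have a fader.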

Synapse requires no installation on OS X, and there’s a relatively simple installation process on Windows. (At least, I’ve found the drivers for Windows to be more reliable, and there’s nothing like MacPorts – just some driver installers and a restart.)

The only caveat: Ryan reports hardware labeled Kinect for Windows and even some newer Kinect for Xbox hardware may not work; I’m investigating.

Synapse for Kinect

Thumbs up for KinectA.

KinectA goes further. Like Synapse, it’s install-free on the Mac and requires some reasonably-manageable driver installs on Windows. But it allows you to choose between object tracking, hand tracking, and skeleton tracking. That’s a bonus, as skeleton tracking requires the infamous “goalpost” gesture before it starts working, and can cause any number of problems in installation situations with users – especially multiple users. Sometimes simpler tracking works better.

As with Synapse, all of this is sent out as OSC – and again, I’ll work on some Pd patches for you to parse the results (and maybe Processing, if I have time).

With multiple tracking methods on, this was pretty hard on the (mid-range, 2010) i5 in my MacBook Pro, but it’s still promising.

KinectA for Kinect

Tracking skeletons in KinectA. Courtesy the developer.

Those are the only two that allow a driverless install on OS X, but I should give an extra nod to the rather-popular OSCeleton. It does require the OpenNI driver install on all three operating systems, but if you are serious about tinkering, you may want to bite the bullet and do that. And there is a package available that gives you the whole setup on Ubuntu Linux, if you don’t care to build from source (though I suspect Linux may be the easiest OS to deal with in that regard).

Even with the installation and my aforementioned apprehension about dealing with students or friends on (cough) troubleshooting, what you do still get is a bunch of OSC messages you can process – and there are examples in Pd already built for you to begin working with the data. So, this would absolutely be my first choice if I were anxious to tinker quickly on Linux, and possibly on other OSes, as well, depending on needs.
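
If you do go the OSCeleton route, the messages report one joint at a time, per user; assuming the /joint format its documentation describes (joint name, user ID, then x/y/z floats – verify the exact argument order against your build), a tiny handler that keeps a per-user skeleton dictionary current might look like this in Python:

```python
def handle_joint(skeletons, address, args):
    """Update a per-user joint dict from an OSCeleton-style /joint message.

    Assumed argument order (check against your OSCeleton build):
    joint name (str), user id (int), x, y, z (floats).
    """
    if address != "/joint" or len(args) != 5:
        return  # ignore anything that isn't a well-formed joint message
    name, user, x, y, z = args
    skeletons.setdefault(user, {})[name] = (x, y, z)
```

Hook that into your OSC receive callback, and skeletons[user_id]["r_hand"] (or whichever joint names your build emits) is always the latest position.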

OSCeleton on GitHub

I’ll be working with Kinect as input with some students in Krakow, Poland next month, here:
patchlab.pl

Expect some documentation after the fact.

Now, I’m very keen to hear from readers. What are you using? What hasn’t worked; what have you found frustrating? What has? Are you working as an individual artist, student, teacher, pro? Are you building Kinect apps for Harmonix?

  • Julius Tuomisto

    NI mate runs on Win/Mac/Linux and comes with all the necessary drivers (both OpenNI v1 and OpenNI v2 versions available): http://www.ni-mate.com

    Another tool to check out is Ethno Tracker (available for Windows): http://ethnotekh.com/software/ethno-tracker/

    • http://pkirn.com/ Peter Kirn

      Yeah, I only left out these licensed solutions – not because I’m opposed to them, but only because if you’re trying to start quickly with a collaborator or work with students, you can’t necessarily buy a license; you may want to start with something you can use for free and then work from there.

    • Julius Tuomisto

      Hi Peter. As you might know, we develop NI mate, and though I do understand your point, I have to attest that many of our users actually are students. Though it obviously doesn’t give you full functionality and you have to restart it once every hour, the free trial of NI mate actually never runs out.

  • Michael Todd

    Don’t forget FaceShift – amazing facial tracking (but a bit expensive for commercial use)

    http://www.faceshift.com

    • http://pkirn.com/ Peter Kirn

      Yeah, I left out those more expensive solutions.

  • Michael Todd

    I also have a couple of Max for Live devices – one that takes the Synapse OSC stuff and maps it to parameters, and another that puts the Kinect camera feeds into the (very excellent) VIZZable and v-module systems.

    http://www.maxforlive.com/library/device/704/kinect-osceleton

    http://www.maxforlive.com/library/device/949/kinect-camera

    • http://pkirn.com/ Peter Kirn

      Ah, cool, thanks for these!

  • i.m. klif

    I use a rather unusual Kinect setup.

    I use Kinect to get a 3D scan into the computer, then I convert the scan data to audio in order to display it on an analog CRT oscilloscope (it is actually a Rutt/Etra scan processor made with MaxMSP using a regular audio card). Once the image is converted to audio, the patch follows the classical modular video patching signal path – very much like LZX Industries’ modular video synths. All effects and modulations take place in the audio domain – there is not a single video effect in this patch.

    It works great. The only limitation I see is the low raster resolution, about 100 lines as opposed to 576 – soundcards have much lower bandwidth than the dedicated video equipment normally used for vector rescanning (google Dave Jones and LZX Industries). On the other hand, I have much more freedom in regard to how the scan lines behave – unlike a true Rutt/Etra, they can be of any frequency and orientation.
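
    For anyone curious, the raster itself is easy to sketch: X ramps across each scan line, Y steps down line by line, and the image displaces Y the way a Rutt/Etra displaces the raster. A Python sketch of the idea (the line count, samples per line, and displacement depth are arbitrary illustration, not my actual patch’s values):

```python
def raster_xy(height, lines=100, samples_per_line=480):
    """Generate X/Y audio samples that draw a displaced raster on a scope.

    X ramps -1..1 across each scan line; Y steps down per line, displaced by
    height(u, v) where u, v are normalized 0..1 image coordinates.
    """
    xs, ys = [], []
    for line in range(lines):
        v = line / (lines - 1)
        for s in range(samples_per_line):
            u = s / (samples_per_line - 1)
            xs.append(2.0 * u - 1.0)                    # horizontal ramp
            ys.append((1.0 - 2.0 * v) + 0.3 * height(u, v))  # line + displacement
    return xs, ys
```

    Write those two lists out as the left and right audio channels and the scope in X/Y mode draws the image.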

    Early demo:
    https://vimeo.com/43432710

    Installation proposal:
    https://vimeo.com/56725330

    • http://pkirn.com/ Peter Kirn

      Wow, excellent! And yes, indeed, very analog in conception.

    • i.m. klif

      It’s amazing how well oscilloscope and Kinect work together.

      A real odd couple ;)

  • michele

    I use mappinect. It was really promising at the beginning, but it hasn’t been updated in a long time. However, it has some unique features: through an XML language, you can describe which events should occur whenever certain conditions happen – like sending an OSC message when my hands touch, and so on.

  • Felix Hoenikker
  • Tanner Thompson

    So I just got a Kinect and have been trying to get some of this to work – I’ve tried on my Retina as well as my 2007 MacBook Pro. They both start up and begin to track my skeleton, using both Synapse and KinectA, but after about 20 seconds the flashing green light on the Kinect stops flashing or skips a beat, and my applications freeze or stop receiving data from the Kinect…. anyone else have this problem? I got the Xbox 360 standalone Kinect from Best Buy

    • Tanner Thompson

      So if anyone has issues with their Kinect – I had to take mine back and get a used one at GameStop – look for the model 1414, not the newer 1473… just a heads up

  • SkyRon™

    hi peter and friends— (sorry i’ve been out of the loop recently—getting into a new semester, new location, labs, etc.)—submitted for your perusal, my group, media experimental ensemble™ (meme™)—performing on kinect-controlled PureData theremins and samplers, and on Quartz Composer patches: http://badmindtime.wordpress.com/2013/09/25/the-lavender-spray-performance-with-kinect/

    The current roadblock is Mountain Lion, so we welcome any suggestions!