Elliot Woods writes in with an extraordinary proof of concept: it couples the depth-sensing capabilities of Microsoft’s Kinect with projection mapping, so imagery can be overlaid directly onto a 3D scene the camera “scans” in realtime. From the looks of the potential here, it’s almost Holodeck good.

Kinect hack + projection mapping = augmented reality (+ hadoukens): Using the Kinect camera, we scan a 3D scene in realtime. Using a video projector, we project onto a 3D scene in realtime. By combining these, we can reproject onto geometry to directly overlay image data onto our surroundings, contextual to their shape and position.

As seen in the video, we can create a virtual light source which casts light onto the surrounding surfaces as a real light source would.

At Kimchi and Chips we are developing new tools so we can create new experiences today. We share these new techniques and tools through open source code, installations and workshops.

More on the Kimchi and Chips blog

You keep sharing, guys. Looks utterly brilliant. I hope to keep tabs on this project in particular. Also, great name.

Updated: The duo is called Kimchi and Chips because they’re a girl + guy, Seoul + Manchester team: Mimi and Elliot. This makes me no less interested in trying the combination as actual food. I’m on it.
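
If you want a feel for what’s going on under the hood, here’s a rough numpy sketch of the virtual-light idea: back-project the Kinect depth image to 3D points, estimate surface normals, then shade each point from a virtual light and send the result to the projector. To be clear, this is my own illustrative approximation, not Kimchi and Chips’ openFrameworks code; the intrinsics are assumed values, the function names are made up, and it pretends the projector is already calibrated and aligned to the depth camera (the genuinely hard part).

```python
# Illustrative sketch only (not the Kimchi and Chips implementation):
# rebuild the scene from a Kinect depth frame, estimate normals, and
# shade each point from a virtual point light.

import numpy as np

# Approximate Kinect depth-camera intrinsics (assumed values).
FX, FY = 580.0, 580.0
CX, CY = 320.0, 240.0

def depth_to_points(depth_mm):
    """Back-project a 640x480 depth image (millimetres) to camera-space points."""
    h, w = depth_mm.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_mm / 1000.0                       # metres
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    return np.dstack([x, y, z])                 # (h, w, 3)

def estimate_normals(points):
    """Cheap per-pixel normals from the cross product of image-space gradients."""
    dx = np.gradient(points, axis=1)
    dy = np.gradient(points, axis=0)
    n = np.cross(dx, dy)
    n /= np.linalg.norm(n, axis=2, keepdims=True) + 1e-9
    n[n[..., 2] > 0] *= -1                      # flip normals to face the camera
    return n

def shade(points, normals, light_pos, intensity=1.0):
    """Lambertian shading from a virtual point light at light_pos (camera space)."""
    to_light = light_pos - points
    dist = np.linalg.norm(to_light, axis=2, keepdims=True) + 1e-9
    to_light /= dist
    lambert = np.clip((normals * to_light).sum(axis=2), 0.0, 1.0)
    falloff = 1.0 / (dist[..., 0] ** 2 + 1e-3)
    return np.clip(intensity * lambert * falloff, 0.0, 1.0)   # image to project

# Example: a fake flat wall 1.5 m away, lit by a virtual light held near a "hand".
depth = np.full((480, 640), 1500.0)
pts = depth_to_points(depth)
img = shade(pts, estimate_normals(pts), light_pos=np.array([0.2, 0.0, 1.0]))
```

In practice that shaded image still has to be warped through the projector’s calibrated intrinsics and pose so each pixel lands on the right surface, which is exactly the calibration work the process video below touches on.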

This video nicely shows some of the process of making this work:

  • http://www.jhhl.net/ jhhl

    Very nice – especially for a 1.0 device like Kinect. The latency is definitely there, but it's still exciting.  Since there's no need to build geometry, it can be a pretty nice technique for, say, projecting costumes on moving people.

  • Buffer

    I haven't been this excited about the direction technology is going in a long time. Kinect is offering a breath of fresh air for interaction designers.

    Loving it.

  • http://dietervandoren.net dtr

    Last week I speculated it wouldn't take long before we'd see Kinect-based projection mapping :)

  • http://www.last.fm/music/(noou) (noou)

    Yeah, awesome… I must confess I couldn't help buying myself a Kinect… now I just have to find the time to play around with it…

  • Naus3a

    Kinect's depth data is awesome for augmented reality; before the end of November I was able to create touchable virtual objects: http://geekjutsu.wordpress.com/2010/12/02/dar-deep-augmented-reality/

    I love cheap exotic hardware :D

  • http://peelyoureyes.com Evan

    One of the best things I've seen. Just gets the mind jumping.

  • kirill_llirik

    I've been waiting a pretty long time for somebody to create this… I love it. Please don't stop developing this :)

  • ale london

    wonderful! keep going!

  • http://youtube.com/lievenvv Lieven
  • http://cflickster.noisepages.com/ CFlickster

    That's freaking sweet. Brilliant idea. I want a Kinect so badly.

    I love that the Kinect starts tracking his other hand at 3:30, and then he has to "virtually" put the light back in his hand… hahah. It would be cool to be able to throw the light around and catch it.

  • Maxou

    Hi,
    For months my friends and I have wanted to do actor video mapping with Kinect using OpenNI, OpenFrameworks and OpenCV.
    We saw your video; it's great, and it's approximately what we want to do.
    Could you contact us at maxou.ipp@gmail.com? We have some questions (mostly about calibration).
    Thank you