The new, svelte-looking Kinect. It’s not the better looks that matter, though: it’s that it sees better. Courtesy Microsoft.
It’s a new world for media artists, one in which we look to the latest game console news because it impacts our art-making tools.
And so it is that, along with a new Xbox, Microsoft has a new Kinect.
The new Kinect uses standard infrared tracking (ideal for in-the-dark footage and accurate tracking), but also returns RGB imagery. It’s 1080p, 30–60 fps (it seems tracking is at 30 fps and video at 60, but I’m reading conflicting reports). Hands-on reports say latency is reduced. If the finished product is consistent with rumors, that could be owing to more in-hardware tracking analysis; once you try to do the analysis on the computer (or console), you encounter additional bottlenecks. Musical readers have much greater expectations of low latency than gamers, though, so it’ll be interesting to see this in practice.
The big news is tracking that gets closer to your body, breaking analysis into smaller bits. Wired, granted exclusive early access, goes into some detail about how the tracking tech has changed. Instead of a straight depth map created by producing a 3D picture from two separate infrared-based camera images, the new tech uses “modulated” IR light. Given that this is new technology, I’m not yet clear on the specifics, and would love some reader feedback. (Ahem.)
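For readers who want to chew on that question: one common interpretation of “modulated” IR is continuous-wave time-of-flight, where depth comes from the phase shift of a returning amplitude-modulated signal rather than from stereo triangulation. This is a speculative sketch of that principle only, not a description of the actual Kinect hardware; the 16 MHz modulation frequency is an assumed figure for illustration.

```python
import math

# Hypothetical continuous-wave time-of-flight depth estimate.
# Assumption: "modulated" IR means CW amplitude modulation, so depth
# follows from the phase shift of the reflected signal.

C = 299_792_458.0  # speed of light, m/s

def depth_from_phase(phase_rad, f_mod_hz):
    """Distance implied by a measured phase shift at one modulation frequency.

    Light travels to the scene and back, so the round trip covers 2 * d:
        phase = 2 * pi * f_mod * (2 * d / c)
    Solving for d gives the expression below.
    """
    return (C * phase_rad) / (4 * math.pi * f_mod_hz)

def max_unambiguous_range(f_mod_hz):
    """Phase wraps at 2*pi, so depth is unambiguous only up to c / (2 * f)."""
    return C / (2 * f_mod_hz)

# At an assumed 16 MHz modulation, half a cycle of phase shift puts the
# surface at half the wrap-around range:
f = 16e6
print(round(max_unambiguous_range(f), 2))      # wrap-around range in metres
print(round(depth_from_phase(math.pi, f), 2))  # depth for a pi phase shift
```

One upshot of this scheme, if it’s what Microsooft is doing: depth precision depends on how finely phase can be measured, not on the baseline between two cameras, which may be part of why per-pixel depth gets better without the sensor getting bigger.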
There’s a reason for Processing’s popularity. By making code simple, elegant, and direct, and catering directly to the kinds of visual ideas creative people have, the environment has made programming accessible to artists and designers in a way nothing else could.
Coding no longer has to be a source of fear, or a bad word.
I still love Processing as a way of sketching out ideas, and with strategic use of the GPU in its now-native OpenGL rendering, it can also be surprisingly high-performance.
Of course, that 2.0 reboot has been a long time coming – enough so that you might even have forgotten it was en route. That’s why the recent 2.0 beta 9 is big news. It includes major new features that finally reveal what 2.0 is all about, plus long-awaited bug fixes. In fact, it’s the moment when the betas stop looking so much like betas. Here’s what to expect.
Live performers simply want a more intimate relationship between music and visuals onstage. Whether they’re working solo or with a live visualist, that means getting useful signal flowing between music and visual tools and performance elements.
Livegrabber has got to be about the easiest way to do this I’ve seen yet. It’s actually a suite of plug-ins for Ableton Live’s Max for Live environment that spits out OSC (OpenSoundControl) messages to any visual tool that can respond.
And as you can see in the video, the results are both effortless and profound.
It’s best seen in this example with VDMX. (Any OSC tool will work – I’m rather keen to code around this with Processing, for instance, just for kicks. But speaking of VDMX, that superb tool is on a fire sale for a hundred bucks this month; act fast!)
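If you do want to roll your own receiver, it helps to know what an OSC packet actually looks like on the wire. Here’s a minimal pure-Python encoder for a simple OSC message, just to make the format concrete; the `/grabber/volume` address is a hypothetical example, not a documented Livegrabber path.

```python
import struct

def osc_pad(b):
    """OSC strings are NUL-terminated, then padded to a 4-byte boundary."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address, *args):
    """Encode a minimal OSC message supporting float and int arguments.

    Layout: padded address string, padded type-tag string (',' plus one
    letter per argument), then each argument as big-endian 32 bits.
    """
    msg = osc_pad(address.encode())
    tags = "," + "".join("f" if isinstance(a, float) else "i" for a in args)
    msg += osc_pad(tags.encode())
    for a in args:
        msg += struct.pack(">f" if isinstance(a, float) else ">i", a)
    return msg

# A hypothetical fader value, the kind of thing a Livegrabber-style
# plug-in might emit. Send these bytes over UDP and any OSC-aware
# visual tool can decode them.
packet = osc_message("/grabber/volume", 0.75)
print(len(packet), packet[:16])
```

In practice you’d hand the sending and parsing to an OSC library in your tool of choice, but seeing the layout – padded address, type tags, big-endian values – takes the mystery out of debugging when two tools won’t talk.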
Melbourne-based artist Jayson Haebich has rendered in thin air, literally, architectural forms in color. He uses custom software to map lasers through particles to produce an ephemeral sculpture of air. The results are gorgeous – frozen digital motion.
From his description:
These are a series of static light sculptures that have been created using laser light, smoke, shadows, physical shapes and custom built software to create complex compositions of shadow and light that play with the sense of depth and perception. These pieces challenge the observer’s idea of perspective and ask them to consider which components of the installation are physical objects and which are non-physical, such as light and shadow. These sculptures fill the room with light and colour, creating an almost mesmerizing effect as the beams of precisely mapped light from a laser cut through swarming particles and haze suspended in mid-air to create almost solid-looking planes of illumination.
These pieces were created using custom-built software, written in OpenFrameworks, that maps out physical features using an RGB laser.
Take your visual app. Take a parameter. Now control it with timelines and automation – anything, anywhere.
The appeal is clear for visualists, particularly with built-in options sometimes limited. Now, sequencing control – clocked to MIDI – can take on powerful new dimensions.
Vezér isn’t the first app to go this direction, by any means. I’ve admired in the past the insanely-powerful IanniX, for instance. But then, “insanely powerful” isn’t always what you want – particularly if that depth just drives you insane trying to do something simple. Vezér, just now entering beta, promises to be a bit simpler. And coming from the developer of CoGe, it also arrives from a coder with a track record.
We’ll be watching for this to come out, but in the meantime, its creator gives CDM a first look. Details:
Vezér is under development, no public beta available yet.
You can have any number of compositions and each composition can contain any number of tracks.
There are different track types in Vezér, like a single MIDI CC message or a MIDI CC range.
Vezér supports Undo/Redo throughout the application, and also supports Copy/Paste of keyframes.
The playback speed of a composition is adjustable and can be synchronized to MIDI Clock.
The resolution – FPS – of a composition can be set.
Vezér supports sending MIDI CC messages.
Different interpolation can be set for each keyframe.
Vezér supports recording of incoming MIDI signals.
Vezér can be controlled via MIDI or OSC.
Vezér is a 64-bit application for Mac OS X 10.6 or later.
Vezér will be a commercial application.
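The per-keyframe interpolation is the feature that makes a timeline app like this expressive, and it’s worth seeing how little machinery it takes. This is a hypothetical sketch of how such a track might be evaluated – the tuple layout, mode names, and track values are all my own invention, not Vezér’s internals; the eased segment uses a standard smoothstep curve.

```python
import bisect

def smoothstep(t):
    """Ease-in/ease-out curve over t in [0, 1]."""
    return t * t * (3 - 2 * t)

def evaluate(keyframes, time):
    """Value of a hypothetical timeline track at `time`.

    `keyframes` is a list of (time, value, mode) tuples sorted by time,
    where mode ("linear" or "ease") shapes the segment toward the next
    keyframe. Before the first or after the last keyframe, the track
    holds the end value.
    """
    times = [k[0] for k in keyframes]
    i = bisect.bisect_right(times, time) - 1
    if i < 0:
        return keyframes[0][1]
    if i >= len(keyframes) - 1:
        return keyframes[-1][1]
    t0, v0, mode = keyframes[i]
    t1, v1, _ = keyframes[i + 1]
    t = (time - t0) / (t1 - t0)  # normalized position within the segment
    if mode == "ease":
        t = smoothstep(t)
    return v0 + (v1 - v0) * t

# A MIDI-CC-style ramp from 0 to 127 over four seconds, eased in the middle:
track = [(0.0, 0, "linear"), (2.0, 64, "ease"), (4.0, 127, "linear")]
print(evaluate(track, 1.0))  # halfway up the first, linear segment
print(evaluate(track, 3.0))  # midpoint of the eased segment
```

Everything else – MIDI Clock sync, adjustable playback speed, composition FPS – amounts to deciding how fast and how often you call something like `evaluate` and where the output bytes go.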
Body mapping and dance/visual fusions are still explored only in fits and starts, compared to the extent of live music and visual performance in other media. So, it’s encouraging to see this latest experiment from dancer Christian Mio Loclair. Working with Microsoft’s Kinect, the slowly-undulating tendrils of visuals behind him create visual counterpoint for headstands and hip-hop dance techniques. Far from running up against latency, here there’s a sense of visuals that answer the moves with a slow sigh, creating a kind of living architectural space behind him.
Christian muses to CDM, “I am very convinced that especially the Kinect and the upcoming Kinect 2 will change the way dance will be performed. I hope I can contribute to this development … just some streetdancers, hackers and a Kinect. I wanna see how far we can get by open source Code, own code and Open Dance.”
Blast from the past: this color organ is from 2007. But it’s a beautiful demonstration of light and sound, fused into a single interface, and thus worth mentioning as I pull together notes for a talk at Mapping Festival tomorrow here in Genève. Compare the 60s-vintage Lumigraph of Oskar Fischinger, which I write about today on Create Digital Music.
In gooey pinks and purples, traced with imaginary sparks, watching the game-controller-manipulated system is like looking into the heart of a great jellyfish made of plasma.