Most of the emphasis on working with multi-touch and alternative controllers has been on our sister site, Create Digital Music. But in a way, visuals are even more demanding of new hardware. After all, musicians have all kinds of hardware that work perfectly for performance (keyboards, knobs, drums, violins, sousaphones, kazoos, and whatnot). But new visual performance media demand something different if they’re to evolve.

Oh yeah, that, and most pro visual apps are kind of a b**** to use with a mouse and aren't all that much better with a tablet. (Unless you've somehow discovered the secret and find a Wacom as easy to use as a ballpoint. Please, tell me how.)

That makes this tidbit all the more interesting:

Jazzmutant is proud to have been selected by the Siggraph Emerging Technologies Committee in San Diego to demo a new prototype device for digital imaging involving multi-touch control. This solution will go beyond mere finger-drawing and clearly illustrate a new way to interact and improve productivity with drawing and video editing software. Furthermore, the solution presented will be the very first multi-touch enabled Tablet PC shown to the public.

JazzMutant news

What’s that now? Visual editing on a multi-touch surface? JazzMutant is best known for the creation of the Lemur multi-touch hardware. It wasn’t specifically intended for music, but that’s where it got most attention; you can, incidentally, route its native OSC control to Processing, Max/MSP/Jitter, Pd/GEM, Flash, and so on. But it was pricey (US$2500), and while you could design your own interfaces for it, it wasn’t quite the same as having a computer.
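
Routing that OSC output to your own software mostly comes down to decoding the packets the hardware sends. As a rough sketch — not JazzMutant's code, and the `/fader1` address is just a made-up example of what a Lemur fader might emit — a minimal OSC message decoder in Python could look like this:

```python
import struct

def read_osc_string(data, offset):
    """Read a null-terminated OSC string, advancing past 4-byte padding."""
    end = data.index(b"\x00", offset)
    s = data[offset:end].decode("ascii")
    length = end - offset + 1          # string plus at least one null
    offset += (length + 3) // 4 * 4    # round up to a 4-byte boundary
    return s, offset

def decode_osc(packet):
    """Decode a simple OSC message: address, type tags, float/int args."""
    address, offset = read_osc_string(packet, 0)
    tags, offset = read_osc_string(packet, offset)
    args = []
    for tag in tags.lstrip(","):
        if tag == "f":                 # 32-bit big-endian float
            args.append(struct.unpack(">f", packet[offset:offset + 4])[0])
            offset += 4
        elif tag == "i":               # 32-bit big-endian int
            args.append(struct.unpack(">i", packet[offset:offset + 4])[0])
            offset += 4
    return address, args

# A packet like a fader at 0.5 might produce (hypothetical address):
packet = b"/fader1\x00" + b",f\x00\x00" + struct.pack(">f", 0.5)
print(decode_osc(packet))  # ('/fader1', [0.5])
```

In practice you'd read these packets off a UDP socket and feed the decoded values into Processing, Jitter, GEM, or whatever is drawing your visuals; libraries like oscP5 for Processing handle this parsing for you.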

Now we get a one-two punch of tantalizing possibilities: a controller specific to visuals, whatever that may mean, and the possibility of using an actual computer with multi-touch input. I’d love to have that with some of what I’m building with Processing these days for performance. I’m a little more skeptical on the visual hardware side, only because so far that has tended to mean a selection of templates for Lemur-like hardware. But either way, this is promising — we’ll be watching the news out of SIGGRAPH very closely indeed.

  • http://UKMAC.co.uk James Rickards

    Having seen what the iPhone's 'multi-touch' can do, I think anything is possible – such a small device doing so much! Let's hope we see more of this implemented in every area of technology!

  • john dalton

    Multi-touch is pretty cool. But don't be so quick to write off the existing Wacom tablet-pen combination. Sometimes it's actually better not to have to reach out and touch the interface: you can hold the tablet in your lap and work with the pen while looking at a screen somewhere else, which can be an advantage in some situations. Something like the Intuos tablet also gives you pen tilt, tilt orientation, pen axis rotation (with the ArtPen), and the airbrush wheel in addition to pen pressure as real-time modulators. So you can modulate several different parameters simultaneously in a performance situation just by how you hold and manipulate a single pen.

    You can also use pen tilt, tilt orientation, or the airbrush wheel to modulate graphics in real time without actually touching the tablet (while hovering within pen proximity of the device), which is a source of real-time controller information you would not get from a multi-touch surface. Personally, I've found there's an initial learning hump to get over before the pen and tablet feel comfortable, but once you're past it, it's actually quite easy to use and can be a big ergonomic help in avoiding hand fatigue or carpal tunnel issues with repetitive editing tasks, certainly compared to using a mouse for long periods.

  • http://future-tense-cpu.com cyberpatrolunit

    Well I visited the event.

    I had a hands-on with the prototype. Honestly, it was a Lemur with a laptop inside the unit. The software running on it was bogged down and slow. The idea and proof of concept were great – I just was not all that impressed.

  • cobalt

    cyberpatrolunit,

    What was the software like? How did the whole multi-touch bit get used? Very curious, thanks.

  • http://future-tense-cpu.com cyberpatrolunit

    I don't have much to say about the software; I really only watched the JazzMutant demo for some time. Here is my reflection on it:

    When I initially heard the news and read the hype, I knew I had to see the tablet prototype. Being an avid user of the Lemur, I am very familiar with the device's responsiveness, the parser math, the physics. On the other hand, I understand that running vector-based drawing software on a laptop inside a Lemur is heavy on CPU resources.

    I suppose I had my heart set on seeing a genuinely new piece of hardware. I was hoping for a device with the responsiveness of a Lemur and video out (DVI) – a live video editing/performance tool, an input/output device taking a direction into 3D space. Something heavily gesture-based, quick, swift, intuitive.

    I am so used to tweaking visuals for video performance with Modul8 and the Lemur that I thought JazzMutant's prototype was heading in that direction, but instead (to me) it looked like Adobe Illustrator running on minimal hardware, with a few new finger tricks.

  • cobalt

    Thanks for the info, cyberpatrolunit. It sounds as if the hardware appearance is very similar to the Lemur/Dexter. I wonder how far they are from finishing up prototype development.

    Anyway, thanks again.