With all the discussion of Processing, I think it’s time to do a proper survey of who out there is actually using this tool to code custom live performance. Robert Hodgin, aka flight404, famously VJ’ed with a lovely Processing rig back in 2005, controlled by four glowing Griffin PowerMate knobs. But because Processing is a general-purpose tool, pumping out everything from bizarre animated musical interfaces to data visualizations to interactive installations, it’s often unclear just how many people are actually using it live.

If you’ve tried to use Processing live without success, I hope to build up some reference materials over the coming weeks to make this easier. But I’m always interested in how people work, from beginners to advanced users. (See our in-progress surveys on Ableton Live and Reaktor on the music side.)

Are you using Processing as a live visual tool? What frustrations have you had? What’s been successful? Got any results you can share, in photos or video?

Do you wish that you could drop Processing sketches into another VJ tool, like Resolume or VDMX? (I know I sure do.)

  • http://www.moongold.me.uk/ Moongold

    I tried using Processing live for part of a set a couple of years ago, using one of Robert Hodgin's early sketches.
    It worked really well and looked simply superb with the rest of the group's sounds. I keep meaning to try updating the code to work on my Macbook – memo to self.
    There is a tiny clip of EMC practicing with the Processing sketch up at http://moongold.me.uk/overview/ (Moongold Visuals). Scroll right down.

  • http://www.vimeo.com/1197976 Luke

    I used Processing + Supercollider3 for a concert at Mills College a while back.

    Successful: Setting up joystick and OSC control was easy. I was also able to use multichannel audio input and analysis via a jack connection to Supercollider. Oh, and it only crashed once in a two hour concert.

    Frustrating: Due to time and framerate constraints I ended up relying a bit too much on 3D boxes. Oh and it crashed in the middle of the concert!
    http://www.vimeo.com/1197976
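    Luke's OSC control would typically go through a library like oscP5 in Processing, but the wire format underneath is simple: a null-padded address string, a padded typetag string, then big-endian arguments. Here is an illustrative decoder for a one-float message – my own sketch, not Luke's code:

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

// Minimal decoder for a single OSC message carrying one float argument.
// Real projects would use a library like oscP5; this just shows the wire format.
public class OscDecode {
    // Read a null-terminated string padded to a 4-byte boundary.
    static String readPaddedString(ByteBuffer buf) {
        StringBuilder sb = new StringBuilder();
        byte b;
        while ((b = buf.get()) != 0) sb.append((char) b);
        int consumed = sb.length() + 1;          // string bytes plus the null
        int pad = (4 - (consumed % 4)) % 4;      // round up to a multiple of 4
        buf.position(buf.position() + pad);
        return sb.toString();
    }

    // Returns {address, value} for a message with typetag ",f".
    static Object[] decodeFloatMessage(byte[] packet) {
        ByteBuffer buf = ByteBuffer.wrap(packet); // OSC is big-endian, the ByteBuffer default
        String address = readPaddedString(buf);
        String typetag = readPaddedString(buf);
        if (!typetag.equals(",f")) throw new IllegalArgumentException("expected one float");
        float value = buf.getFloat();
        return new Object[] { address, value };
    }

    public static void main(String[] args) {
        // "/fader" padded to 8 bytes, ",f" padded to 4 bytes, then a 32-bit float.
        ByteBuffer out = ByteBuffer.allocate(16);
        out.put("/fader\0\0".getBytes(StandardCharsets.US_ASCII));
        out.put(",f\0\0".getBytes(StandardCharsets.US_ASCII));
        out.putFloat(0.75f);
        Object[] decoded = decodeFloatMessage(out.array());
        System.out.println(decoded[0] + " = " + decoded[1]);
    }
}
```

    In practice a library hands you the parsed message and you just map addresses like /fader to sketch parameters.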

  • http://superdraw.intervalstudios.com superDraw

    so, of course I used processing to create the version of superDraw that I used for years and years. I've very recently reprogrammed it (superdraw) in java but am still using parts of the processing core for some very simple stuff. Processing for me has been rock solid, never crashed (and I've played 4-hour sets here and there). I use it to send/receive OSC and midi, get joystick info, load and save data files to the hard drive, and of course get data from the wacom tablet. I used the openGL renderer in processing to do all my rendering and have enjoyed framerates from 60-90 fps depending on what I throw at it (yes, I do have an advantage in that superDraw is fairly simple).

    processing is easy to use, takes the pain out of programming by doing a lot of the tedious stuff for you, and is fantastic for simple one-off ideas.

    my one frustration (though minor and as ben fry has stated really probably beyond what processing is meant for) is that it doesn't have openGL pixel/vertex shader support built in. there are of course easy ways to get to it, and a few examples, but it's not actually part of processing (yet?).

    everything on the superdraw site was generated by the processing version of superdraw, I haven't posted any video or images from the new version yet. processing is awesome!

    =josh

  • http://createdigitalmusic.com Peter Kirn

    Wow, some great examples. Luke, I really enjoy the painterly bits in the example you have. I wonder what the source of your crash was … you weren't using video files by any chance?

    And Josh, thanks for that great answer. I think rewriting in Java is, of course, part of the idea.

  • wesen

    I did some vj gigs with processing, using a bunch of simplistic minimal animations that are synced to my drum machine (I just let the VJ stuff run on the side). I also used them at a punk concert and it came out really great. I built a kind of framework to easily tempo-sync patches and to allow changing them using midi notes. Unfortunately I lost the better part of my animations in a stupid svn crash, but I will rewrite a lot of them for the next gig. Here is a very very very early crappy test (the music is crappy as well):
    http://www.youtube.com/watch?v=JdzuUhZ_Vjc
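    A tempo-sync framework like the one wesen describes usually hangs everything off a beat clock: animations sample a beat phase derived from the BPM and the system clock. A minimal sketch of that idea – the names and structure here are my assumption, not his framework:

```java
// Everything animates off a beat phase derived from the clock and BPM.
public class BeatClock {
    final double bpm;
    final long startMillis;

    BeatClock(double bpm, long startMillis) {
        this.bpm = bpm;
        this.startMillis = startMillis;
    }

    // Fractional position within the current beat, in [0, 1).
    double phase(long nowMillis) {
        double beats = (nowMillis - startMillis) * bpm / 60000.0;
        return beats - Math.floor(beats);
    }

    // Whole beats elapsed since start -- handy for switching patches on midi notes.
    long beatIndex(long nowMillis) {
        return (long) Math.floor((nowMillis - startMillis) * bpm / 60000.0);
    }

    public static void main(String[] args) {
        BeatClock clock = new BeatClock(120, 0); // 120 BPM -> 500 ms per beat
        System.out.println(clock.beatIndex(1250)); // 2 full beats elapsed
        System.out.println(clock.phase(1250));     // halfway through beat 3
    }
}
```

    In a Processing sketch, draw() would call phase(millis()) each frame and use it to drive sizes, colors, or positions, so everything stays locked to the tempo.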

  • deastman

    Not exactly a VJ gig, but I recently used Processing for an in-house corporate tradeshow event. I created a cartoon character which was puppeted in real time via Wacom tablet and a Peavey PC-1600 MIDI faderbox, with audio input from a microphone driving automatic lipsync. Although this was a 2D character, I used P3D for hardware accelerated graphics, and threw in some 3D translation and rotation controls. Processing was easily up to the challenge, even when rendering at HD resolution, using an under-spec'd laptop, for display on a 52" plasma screen.
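    Automatic lipsync of the kind deastman describes usually boils down to mapping microphone amplitude to how open the mouth is drawn, with smoothing so it doesn't flicker. A rough sketch under that assumption – the gain and smoothing constants here are invented, not his settings:

```java
// Map microphone level to mouth openness with exponential smoothing.
public class LipSync {
    double mouthOpen = 0;     // 0 = closed, 1 = fully open
    final double smoothing;   // 0..1, higher = snappier response

    LipSync(double smoothing) { this.smoothing = smoothing; }

    double update(float[] micBuffer) {
        // RMS level of the current audio buffer.
        double sum = 0;
        for (float s : micBuffer) sum += s * s;
        double rms = Math.sqrt(sum / micBuffer.length);
        // Map level to openness, clamp, and ease toward the target.
        double target = Math.min(1.0, rms * 4.0); // gain of 4 is arbitrary
        mouthOpen += (target - mouthOpen) * smoothing;
        return mouthOpen;
    }

    public static void main(String[] args) {
        LipSync lips = new LipSync(0.5);
        float[] loud = new float[256];
        java.util.Arrays.fill(loud, 0.5f); // sustained sound, RMS 0.5
        for (int i = 0; i < 20; i++) lips.update(loud);
        System.out.println(lips.mouthOpen > 0.99); // mouth opens on sustained sound
    }
}
```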

  • http://www.vimeo.com/1197976 Luke

    To Peter and anyone interested in using Processing-
    My crash was due to a memory leak caused by either intensive object allocation/deallocation, an excess of network OSC messages, or maybe the wonky jack connection… I wouldn't really blame the program's one hiccup on Processing, but rather on the ability to add so many bells and whistles that I forgot about good programming practices, like error checking and memory profiling. Processing has been really good to me!

  • naus3a

    i love using processing for visualist stuff. i just think it still needs a little standardization for output at vj gigs

  • http://www.stefangoodchild.com/ Stefan

    I use it in a hands-off way when I play live. I have a simple setup that triggers random scene changes and camera changes on audio peaks, and most scenes are audio reactive, so it looks pretty varied through the set.

    I have a rendered version on vimeo which I did a few weeks ago just before my second live gig. Not my music in the render though :-)
    http://vimeo.com/1263182
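    The peak-triggered switching Stefan describes can be sketched as: when the audio level crosses a threshold and a cooldown has elapsed, jump to a random scene other than the current one. The structure and constants below are my assumptions, not Stefan's code:

```java
import java.util.Random;

// Switch to a random scene on an audio peak, rate-limited by a cooldown.
public class SceneSwitcher {
    final double threshold;
    final long cooldownMillis;
    final Random rng;
    int currentScene = 0;
    long lastSwitch;

    SceneSwitcher(double threshold, long cooldownMillis, long seed) {
        this.threshold = threshold;
        this.cooldownMillis = cooldownMillis;
        this.rng = new Random(seed);
        this.lastSwitch = -cooldownMillis; // so the very first peak can switch
    }

    // Feed one audio level per frame; returns true when the scene changed.
    boolean update(double level, long nowMillis, int sceneCount) {
        if (level > threshold && nowMillis - lastSwitch >= cooldownMillis) {
            int next = rng.nextInt(sceneCount - 1); // pick any scene but the current one
            currentScene = next >= currentScene ? next + 1 : next;
            lastSwitch = nowMillis;
            return true;
        }
        return false;
    }

    public static void main(String[] args) {
        SceneSwitcher sw = new SceneSwitcher(0.8, 2000, 42);
        System.out.println(sw.update(0.9, 0, 5));    // peak -> switch
        System.out.println(sw.update(0.95, 500, 5)); // peak, but inside cooldown
        System.out.println(sw.update(0.9, 2500, 5)); // cooldown elapsed -> switch
    }
}
```

    The cooldown is the important part: without it, a sustained loud passage would strobe through scenes every frame.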

  • http://createdigitalmusic.com Peter Kirn

    Really beautiful work, Stefan! I like what you did with the camera. Is this your own custom camera code? Did you look at another library? (Thought of making this a library?)

    If there is a new camera library at some point, I think a Battlestar Galactica Handicam mode is a must. ;)

  • http://www.stefangoodchild.com/ Stefan

    Cheers Peter,

    It's a couple of classes I knocked up myself and I'm pretty sure the code isn't good enough to turn into a library just yet, but I'm not averse to releasing the classes if I ever get to the point where the code isn't embarrassing. If you want to cast your eye over it let me know, you can stick them on the CDM labs if they pass muster.

    The handheld "chase" camera class is 100% inspired by the new BG series. I love the work they did in the space scenes and wanted to get a bit of that vibe going.

    I looked at OCD when I started out, but just as I got going a new version of Processing came out and broke it, so I had to roll my own. Once mine had been written, OCD got fixed. Typical!
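    A handheld chase camera in that style generally eases toward a moving target and layers a little random jitter on top for the "operator" feel. This is a guess at the general technique, not Stefan's classes – all names and constants are invented:

```java
// Ease the camera toward a target each frame, plus optional handheld shake.
public class ChaseCam {
    double x, y, z;        // current camera position
    final double ease;     // 0..1, fraction of the gap closed per frame
    final java.util.Random rng;

    ChaseCam(double ease, long seed) {
        this.ease = ease;
        this.rng = new java.util.Random(seed);
    }

    // Move a fraction of the way toward the target, then add jitter.
    void update(double tx, double ty, double tz, double shake) {
        x += (tx - x) * ease + (rng.nextDouble() - 0.5) * shake;
        y += (ty - y) * ease + (rng.nextDouble() - 0.5) * shake;
        z += (tz - z) * ease + (rng.nextDouble() - 0.5) * shake;
    }

    public static void main(String[] args) {
        ChaseCam cam = new ChaseCam(0.2, 7);
        for (int i = 0; i < 60; i++) cam.update(100, 0, 0, 0.0); // no shake: pure easing
        System.out.println(cam.x > 99.9); // after 60 frames the camera has settled
    }
}
```

    In a sketch, update() would run once per frame and the result would feed camera() or translate()/rotate() calls; turning shake up gives the drifting, jittery BSG look.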

  • wesen

    very very nice stefan, that's incredibly inspiring!

  • http://www.stefangoodchild.com/ Stefan

    ps.. This inspired me to get the classes tidied up a bit. Still not the best coding in the world or anything. They are very simple really.

    You can get them here

    Any tips on how they can be improved gratefully received!

  • http://pir2.org outpt

    Peter, you saw my Processing visuals at HOPE. :) It was my first time VJing and it worked out well enough for me that I plan on adding to my app. I posted a video here of what my stuff looks like on its own: http://vimeo.com/1463104

  • http://motscousus.com/stuff/2008-02_Processing.org_patterns mots

    i use it live for visual performances.
    Patterns based on the "Nine Block Patterns", controlled by keyboard, midi, and sound through the line input (FFT and beat detect)
    http://motscousus.com/stuff/2008-02_Processing.or
    http://www.youtube.com/watch?v=gmpb9YYEPLU http://www.youtube.com/watch?v=zAc2B6pdXug
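    The "beat detect" mots mentions is often done by comparing the instant energy of an audio buffer against a running average – a spike well above the average counts as a beat. This is a generic sketch of that technique, not mots' code:

```java
// Energy-spike beat detection: flag a beat when the current buffer's
// energy jumps well above an exponential moving average of recent energy.
public class BeatDetect {
    final double sensitivity;   // how far above average counts as a beat
    double avgEnergy = 0;
    boolean primed = false;

    BeatDetect(double sensitivity) { this.sensitivity = sensitivity; }

    // Feed one buffer of samples per frame; returns true on an energy spike.
    boolean isBeat(float[] samples) {
        double energy = 0;
        for (float s : samples) energy += s * s;
        energy /= samples.length;
        boolean beat = primed && energy > avgEnergy * sensitivity;
        // Update the moving average of recent energy.
        avgEnergy = primed ? 0.9 * avgEnergy + 0.1 * energy : energy;
        primed = true;
        return beat;
    }

    public static void main(String[] args) {
        BeatDetect bd = new BeatDetect(2.0);
        float[] quiet = new float[512];
        float[] loud = new float[512];
        java.util.Arrays.fill(quiet, 0.1f);
        java.util.Arrays.fill(loud, 0.9f);
        for (int i = 0; i < 10; i++) bd.isBeat(quiet); // settle on background level
        System.out.println(bd.isBeat(loud));  // energy spike -> beat
        System.out.println(bd.isBeat(quiet)); // back to quiet -> no beat
    }
}
```

    Running the same comparison per FFT band instead of on the raw buffer lets you trigger different visuals on kick versus hi-hat.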

  • grigori

    would love to learn more about it, for sure from the live performance point of view, with the ability to load scripts as you would video clips.

  • http://www.blprnt.com blprnt

    We built a performative installation for the Vancouver Art Gallery using Processing last month – 4 live camera feeds and 4 semi-independent projected 'screens' incorporated with the gallery's architecture.

    Processing was mostly up for the task – but video performance is notoriously slow. We are considering OpenFrameworks for the next version of our installation, to try to get a bit more speed out of things. Another option may be to handle video through OpenCV instead of Processing's native video libraries.

    There's a semi-detailed post here:
    http://blog.blprnt.com/blog/blprnt/glocal-at-the-

  • http://mf.grimaceworks.com Michael Forrest

    I have started using Processing to make visual instruments to use as part of my live sets. See this: 'Falling Leaves' – a screen capture of what it sounds like when I do it. I quickly came up against the limitations of the Processing IDE, though, so this was put together with PApplet but in Eclipse / normal Java. Which is annoying, because I can't export it nearly as easily (so for my next project I am going to write something as a library in Java and then use the Processing IDE to import those libraries with a minimal bootstrap project, so I can get some executable files).

  • http://Lx7.ca Vergel E

    Because Processing can be such a CPU hog, I output a couple of different audio visualization videos and compiled them into Resolume to make a music video…

    for a first attempt, the results came out pretty good. Trig Calc Math – Antithesis