Syphon Teaser from vade on Vimeo.

In the audio realm, piping audio and MIDI between apps is commonplace (see ReWire, JACK, Soundflower, IAC MIDI, etc.). But imagine if you could take textures and frames from one app and share them, live and real-time, with another app. That’s the vision of Syphon, a Mac-only, open-source framework that promises to share graphics and live video between any visual app on the platform, from 3D apps to live VJ/video tools.

Syphon is in development, and not everything is public-facing yet, but it’s moving incredibly fast, and we’re able to take a first look. (The software is currently in a quite-stable private beta.) First, its creators sum up what it’s about more eloquently than I can:

Syphon is an open source Mac OS X technology that allows applications to share frames – full frame rate video or stills – with one another in realtime. This means you can leverage the expressive power of a plethora of tools to mix, mash, edit, sample, texture-map, synthesize, and present your imagery using the best tool for each part of the job. Syphon gives you flexibility to break out of existing graphics pipelines and mix creative applications to suit your needs.

Sharing frames happens behind the scenes so you can minimize or hide your apps and frames still flow. Frames shared via Syphon support an alpha channel, so rendered 3D content, masks, keys and transparency all work as expected. Wherever possible, published frames stay on the graphics card, so Syphon is fully hardware accelerated, and does not duplicate resources unnecessarily. This means Syphon is fast. You can share HD video and larger in realtime with little overhead between applications.

Lastly, Syphon is designed for and by new media technologists, realtime video performance artists and visualists.

Syphon Introduction from vade on Vimeo.

Let’s put this in simpler terms: Syphon is all about collaboration, whether that means working seamlessly across tools on your own or, as described below, opening up a more fluid interchange of ideas with another artist.

Anton Marini, who co-created the first implementation with Tom Butterworth, tells us more.

The genesis of Syphon for me is essentially two-fold. I had been working with Mary Ann [Benedetto] (my better half, aka outpt) and wanted to perform together in some way. Neither of us had a video mixer at the time, and investing 1K in a standard definition analog mixer (V4 and friends) seemed completely insane by every measurement, even though there was no talk of the Spark DFuser at all at the time. [Ed.: That's the awesome community DVI mixer initiative; more on that soon, as the world that has both the DFuser and Syphon in it is looking very happy, indeed.]

She also used Processing, and I was moving between Max/Jitter and Quartz Composer/VDMX and my own custom Cocoa VJ app. The solution I came up with to allow Mary Ann and me to work together was a hardware-accelerated screen capture plugin for Quartz Composer, which let VDMX capture the rendering of a Processing sketch that ran locally on my system and that Mary Ann controlled remotely from her machine. She built a front-end controller in Max/MSP and used it to drive the Processing sketch, which I then had on screen and brought into my mix in VDMX.

This was fragile, but worst of all, it captured on-screen pixels, so anything interfering with or occluding the window in the capture area would show up in the output. Suffice it to say it was inelegant, clumsy, and took up extra monitor real estate.

I asked around at the time if there were ways to share actual texture resources between apps, and the universal answer was simply “not possible”.

The other part of wanting to do this comes from my background in video engineering: using Max and Jitter, routing video around (think video patch bay) was second nature. Why couldn’t this happen between software, in a highly performant way? It seemed obvious.

Now it is, at least in OS X 10.6, thanks to a new API called IOSurface. [Ed.: For more on IOSurface, see Hidden Gems of Snow Leopard: IOSurface on the CocoaAdHoc blog.]

My theory as to why IOSurface exists — and why it’s specific to Mac OS X — is that it’s the result of the convergence of QuickTime’s age, the move to QTKit/Cocoa, and the lack of 64-bit support in the QuickTime code underlying QTKit. This is all conjecture — I have no idea if this is true — but I know enough of the status of the APIs to think it’s likely one cause for IOSurface’s creation. [Ed.: Just to reaffirm that point: the ways of OS framework development are mysterious, but I agree with Anton that it may at least have been a significant contributing factor.]

QuickTime is old and funky enough that porting it to 64-bit is non-trivial, yet devs needed QTKit to function in Cocoa land. Cue the move to 64-bit-capable OSes (10.4 initially, then 10.5, and now fully 64-bit 10.6), and APIs had to be updated at every level of the stack. QuickTime was (and still is) the odd man out with regard to full 64-bit support; many things in QTKit won’t work or can’t be done unless you fall back to 32-bit-only APIs in the older QuickTime library. So a cross-app API was presumably needed to pass frames from a 32-bit process (with access to those 32-bit-only QuickTime APIs) to a front-end, 64-bit host app; there needed to be a way to pass frames to and fro with little to no overhead. I believe this is why IOSurface was created. That need (and the functionality still missing from QTKit to this day) is also what made it public rather than internal-only.

Once 10.6 came out and IOSurface became a public API, some folks took notice. However, there was no documentation and no announcement on its intended use.
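[Ed.: For the technically curious, the basic mechanic is easy to sketch from Apple's public headers alone. What follows is an illustrative fragment, not Syphon's actual internals: one process creates an IOSurface and hands its system-wide integer ID to another process, which rebinds the same surface as an OpenGL texture. The IPC used to move the ID between processes is omitted, and the helper function names are hypothetical.]

    // Illustrative only: cross-process texture sharing with IOSurface on 10.6.
    // Helper names (CreateSharedSurface, TextureFromSurfaceID) are hypothetical.
    #import <Foundation/Foundation.h>
    #import <IOSurface/IOSurface.h>
    #import <OpenGL/OpenGL.h>
    #import <OpenGL/gl.h>
    #import <OpenGL/CGLIOSurface.h>

    // Server process: create a shared BGRA surface and fetch its global ID.
    IOSurfaceRef CreateSharedSurface(size_t width, size_t height, IOSurfaceID *outID)
    {
        NSDictionary *props = [NSDictionary dictionaryWithObjectsAndKeys:
            [NSNumber numberWithUnsignedLong:width],  (id)kIOSurfaceWidth,
            [NSNumber numberWithUnsignedLong:height], (id)kIOSurfaceHeight,
            [NSNumber numberWithUnsignedInt:4],       (id)kIOSurfaceBytesPerElement,
            nil];
        IOSurfaceRef surface = IOSurfaceCreate((CFDictionaryRef)props);
        *outID = IOSurfaceGetID(surface); // a plain integer, valid system-wide
        // ...send *outID to the other process over the IPC of your choosing...
        return surface;
    }

    // Client process: look the surface up and alias it as a GL texture.
    GLuint TextureFromSurfaceID(CGLContextObj cgl_ctx, IOSurfaceID surfaceID)
    {
        IOSurfaceRef surface = IOSurfaceLookup(surfaceID);
        GLuint tex;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_RECTANGLE_ARB, tex);
        // The texture now references the surface's pixels directly:
        // no copy, no readback, everything stays on the GPU.
        CGLTexImageIOSurface2D(cgl_ctx, GL_TEXTURE_RECTANGLE_ARB, GL_RGBA8,
                               (GLsizei)IOSurfaceGetWidth(surface),
                               (GLsizei)IOSurfaceGetHeight(surface),
                               GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV,
                               surface, 0);
        return tex;
    }

That last call, CGLTexImageIOSurface2D, is the trick: it binds the shared surface directly into the receiving app's OpenGL context, which is what lets frames stay on the graphics card.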

Once it was apparent what IOSurface could do, I emailed a group of commercial VJ and media developers and tried to get a joint development effort going on an “Open Video Tap” project. Folks were interested, but no one knew if it would work.

Months later, some small examples started popping up from experienced devs and random mailing list postings. At that point, I was working extensively with Tom Butterworth on QC plugins and other projects. We ended up doing some basic IOSurface work for a 64-bit QuickTime plugin for QC. Once we figured out how to get that working, we went back and prototyped a basic framework to allow frame sharing to happen easily.

Once Tom got hold of the framework, he put in an amazing amount of work to polish and optimize it, and really make it shine. He re-engineered it so that clients and servers are very tolerant of interruptions or crashes: a client can’t bring down a server, and vice versa. That is, he solved the hard problems. :)

As it stands, we have fully working and debugged implementations of Syphon for Quartz Composer, FreeFrameGL for Mac, and Max/MSP/Jitter. We have a partial implementation for [game engine] Unity 3D Pro 3.0, and are working on a few more. All of these implementations are Mac-only, because they rely on the 10.6 specific IOSurface framework. [Ed.: Check out Processing - that's a high priority, if anyone wants to claim the bounty and has some JNI and OpenGL experience. -PK]

Hopefully that fills it in. Now I can use AUVI with VDMX via Syphon – and many other combinations. There’s lots of fun to be had, and no more compromising on features :)
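[Ed.: To make that concrete for developers, here's roughly what publishing and receiving a frame looks like against the framework's Objective-C API. Syphon is still in private beta, so treat the class and method names below as provisional; cglContext, textureName, width, and height stand in for an app's existing OpenGL state.]

    // A sketch of host-app integration with the beta Syphon framework;
    // names are provisional. cglContext, textureName, width and height
    // are placeholders for the app's existing OpenGL state.
    #import <Syphon/Syphon.h>

    // Publishing: wrap an existing OpenGL texture and serve it by name.
    SyphonServer *server = [[SyphonServer alloc] initWithName:@"My Output"
                                                      context:cglContext
                                                      options:nil];
    [server publishFrameTexture:textureName
                  textureTarget:GL_TEXTURE_RECTANGLE_ARB
                    imageRegion:NSMakeRect(0, 0, width, height)
              textureDimensions:NSMakeSize(width, height)
                        flipped:NO];

    // Receiving: discover a running server, then (with a GL context
    // current) pull its latest frame as a GPU-side texture. No pixels
    // ever cross back over the bus.
    NSDictionary *description =
        [[[SyphonServerDirectory sharedDirectory] servers] lastObject];
    SyphonClient *client =
        [[SyphonClient alloc] initWithServerDescription:description
                                                options:nil
                                        newFrameHandler:nil];
    SyphonImage *frame = [client newFrameImage];
    GLuint tex = [frame textureName];

A dozen or so lines on each side is the whole integration story, which goes some way toward explaining how quickly client support is appearing.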

One side note: why shouldn’t this be possible on other platforms, given that the action is all happening on the GPU? Odds are, the answer goes back to Anton’s theory about how IOSurface came about in the first place. Whatever the reason, the facilities currently exist only on the Mac. More fundamentally, the reason this wasn’t a feature of graphics driver architectures to begin with may simply be that no one saw an obvious need. Once you see the results in action, the power of such a facility is apparent, so perhaps development on other platforms will become more feasible. Elsewhere, though, this would require the support of the graphics driver vendors (AMD, NVIDIA, Intel), and it would have to happen without a single vendor like Apple to demand it – so for now, expect this to remain Mac-only. (Graphics driver writers, if you’re out there and need us to make a case to your bosses…)

Back to its original purpose, what’s incredible about being able to pass frames in real-time between applications is that it transforms the role of an application for visuals. Instead of being an island, a single tool is interconnected with everything else you run. It’s a fantastic feeling, the sense that the computer really is an open environment.

As Anton puts it, the hope is to build an “ecosystem” around shared visual tools.

Stay tuned for more as Syphon becomes public and more client support is made available.

Video Demos

Quartz Composer + VDMX

Syphon for QC & VDMX from vade on Vimeo.

FreeFrameGL (Mac) + Resolume Avenue

Syphon for FFGL & Resolume from vade on Vimeo.

Max/MSP/Jitter

Syphon 2K & 4K Jitter Demo from vade on Vimeo.

  • http://julapy.com/blog/ julapy

    wow, this is truly revolutionary.

    will definitely be keeping an eye out for syphon as it matures.

    + would love to see it make its way into openframeworks and processing.

  • Blair neal

    As soon as I heard about this I was blown away… this is one of those rare game changers that comes along once in a great while. A little birdy has told me that there is an openFrameworks add-on in the works, but Processing is apparently a little more difficult…

  • http://vade.info vade

    OpenFrameworks is 100% doable. I want to have an ofxSyphon ready to go for release, but you know how these things go; we’re not promising anything. There is still time.

    As for Processing, it’s a bit more involved, due to needing to shim our Obj-C framework into Processing’s OpenGL context and insert our calls: Java Native Interface wizardry and some rather intimate knowledge of Processing’s rendering back end. I think I’ll pass the buck on that one and let someone else have all the fun ;)

  • bilderbuchi

    awesome stuff! now here's hoping for a cross-platform solution… :-(

  • http://www.decollage.tv rolin

    That's great and promising news, can't wait to give it a try! Great work!!

  • Sam

    Amazing, I have been waiting for something like this for a long time. I will be upgrading to 10.6 for this.

  • Peter Kirn

    Anton and I have talked about this separately, but the smart way to go with Processing I suspect is to work with GLGraphics, not the default OPENGL engine in Processing. In fact, who knows, maybe we'll find out there's some way to hack something similar on other platforms, at least with Processing itself… but this is special what's on Mac OS, for sure.

  • http://www.twitter.com/vjbridd bridd

    openFrameworks is something I’m really hoping to be able to link this in with, so I really hope there’s time for it in the release! :)

    Great work, Vade + Tom – this is going to be very much appreciated!

  • http://www.resolume.com bart

    Great work guys! Would love to test it in Resolume.

  • http://coge.lovqc.hu .lov.

    Awesome work! CoGe will have a Syphon Server feature, i'm sure :)

  • http://pixlpa.com Andrew Benson

    Anton+Tom, excellent work. This might just change how a lot of us work.

    How many times have you not used something that looked perfect for a project just because it was in the wrong application? Nobody wants to completely redo their whole rig just to be able to take advantage of a specific plugin, effect, or patch, but it happens. I look forward to seeing how this gets put to use!

  • http://vizzie.com vizzie

    wow. this is so cool. Would there be a way to route the video from Apple DVD Player into other applications, so I could mix DVDs into Resolume?

  • http://www.ilektron.com Mudo

    Spreading!

  • ext

    cool!

  • http://mediapathic.net mediapathic

    Oh man, vade, bringing it again. I remember discussing this with you at hope a few years back and being all impressed with the hack to bring outpt's stuff in. And now look at it. Cheers!

  • Buffer

    I was just thinking yesterday how cool it would be if i could bring openFrameworks into Jitter and then this comes up! Agh…love it.

    Thanks for putting this together, this could be quite the liberating experience.

  • http://www.digitalmediajockey.com Kevin Hackett

    I've wanted something like audio/MIDI ReWire for video for a long time. Great news. Would it be possible to get the output of apps running on a network?

  • http://timothybright.com Tim

    Wow,

    This is really exciting. I was using Vade's earlier screen-capture plug-in on a number of projects, and as exciting as it was, I had the same issues Vade describes – essentially, it was awkward.

    Thanks so much to everyone that worked on this.

  • http://- tiago morgado

    This is just the thing I needed to justify buying equipment for video (a film camera, a photo camera, a MacBook Pro) and some licenses for visuals software (Kineme, VDMX, etc.), and to dive deeper into visuals.

  • Pingback: Leave No Trace » Mixing visuals

  • http://www.chromatouch.wordpress.com Leon Trimble

    I've wanted something like this from the beginning, and have tried various things to achieve it (Camtasia Studio, FireWire/gigabit Ethernet streaming, video capture, etc. etc.), but was never satisfied, and always wondered why it couldn't be done on the GPU…

    so this is great, can't wait to give it a whirl. and i think it's time to buy vdmx.

  • http://www.vlab.pe r4f0

    Thanks! This is all we've wanted for a long time. Genius!!!

  • http://www.newoperahero.com Mike

    Any PC equivalents of this at all? VVVV can do it internally, and I guess Debugmode FrameServer + AviSynth is another way to do it, but I would love a generalized way to do the same job that works with everything.

  • http://vade.info vade

    I am not personally aware of any method to share textures across applications on any other platform. That doesn't mean there isn't one, but I've certainly not heard about it.

  • Ryan

    I find it sad that Paolo Località gets no explicit mention here.

  • Jellz

    Any idea when there might be a beta release? I have been looking for something like this for my new project and couldn't write a jitter patch that worked myself!

  • http://vade.info vade

    It will be ready when it's ready.

  • http://www.ericparren.net Eric

    Will there be support for Cinder?

    And (maybe a stupid question) can these IOSurfaces be shared over a network with Syphon? That would create awesome possibilities for collaborations.

  • Pingback: Create Digital Motion » Removing the Walls Between Mac Visual Apps: Syphon Beta, Projects, Mad Mapping

  • http://frontiernerds.com Eric Mika

    This is amazing. Sorry I missed the demo @ITP today.

  • Pingback: Swedish VJ Union » Blog Archive » Syphon – a new revolutionary technology for OSX

  • http://www.vjrama.co.uk rob

    i read the top description saying it's gonna be Mac-only. what a shame! if it were both Mac & Windows, real magic could happen, like mixing Windows-only software with Mac-only software from one computer to the next. maybe in a later release? fingers crossed; that would be awesome, the stuff dreams are made of. the possibilities of such software blow my mind. having both xnth & Quartz Composer, that would be slick. plz consider a Windows release as well; with sales to both Windows & Mac users, that's gonna be a big wage packet.

  • http://www.visualdivision.co.ik Visual Division

    Great Idea! Testing with Resolume Ave now. I hope this doesn't go the way of ReWire though.