Blast from the past: this color organ is from 2007. But it’s a beautiful demonstration of light and sound, fused into a single interface, and thus worth mentioning as I pull together notes for a talk at Mapping Festival tomorrow here in Genève. Compare the 60s-vintage Lumigraph of Oskar Fischinger, which I write about today on Create Digital Music.
In gooey pinks and purples, traced with imaginary sparks, the game controller-manipulated system is like looking into the heart of a great jellyfish made of plasma.
Nail the finger fireworks of a particularly hard Rachmaninoff, and you may well feel like blasts of light are shooting out of the piano. But to give the audience the same sense, a DIY instrument made of cardboard and homebrewed responsive lighting translates that keyboard virtuosity to an optical show. Reader Aylwin Lo sends us this project out of Canada:
I’m with YAMANTAKA // SONIC TITAN. We’re an art collective based in Toronto and Montreal that is most known for making music and putting on dramatic live shows. People like Pitchfork, Vice, and MTV Iggy have nice things to say about us.
We made a video of our keyboard player pulling off a notoriously difficult Rachmaninoff composition on a special piano we constructed
from an electric piano, a cardboard baby-grand shell, and a homebrew, Arduino-based LED light rig, and we thought you might like it.
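If you're curious how a rig like that might work under the hood, here's a minimal sketch of the kind of note-to-light mapping involved: note number picks a position along an LED strip, velocity sets brightness. The constants and function names are my assumptions, not the band's actual firmware.

```python
# Hypothetical note-to-light mapping for a homebrew LED piano rig.
# The strip length and note range are assumptions for illustration.

NUM_LEDS = 60                  # assumed LED strip length
LOW_NOTE, HIGH_NOTE = 21, 108  # MIDI range of an 88-key piano

def note_to_led(note, velocity):
    """Map a MIDI note-on to (led_index, brightness 0-255)."""
    note = max(LOW_NOTE, min(HIGH_NOTE, note))
    span = HIGH_NOTE - LOW_NOTE
    led = (note - LOW_NOTE) * (NUM_LEDS - 1) // span
    brightness = min(255, velocity * 2)  # velocity 0-127 -> 0-254
    return led, brightness

# Middle C (60) at velocity 100 lands mid-strip:
print(note_to_led(60, 100))  # (26, 200)
```

On an actual Arduino you'd do the same arithmetic in the sketch and push the result out to the strip, but the mapping itself is this simple.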
You thought right. Now, if you want to play piano like this, you … uh, have some practicing ahead. But in a novel twist on crowd-funding rewards, they’re also using this very artist to help – even as they work on a video game. (Stay with me here.) Continue reading »
Entropy never looked this good. Or, certainly, it looks a lot better than when I broke that beer glass. (I know, I know: there are reasons why we don’t want to live in a universe where doing that would make the glass re-assemble itself from its fragmented shards.) Photo courtesy the designers.
In an elegant, balletic dive, taking an almost impossibly long span of time, a single droplet of water falls and splashes, an animated logo peeking out from the inside. But it’s what isn’t there that may surprise you. There’s no slow-motion camera behind the scenes; the usual way of getting this shot is absent.
Instead, what you’re seeing is a stop-motion time lapse – a record of the shifting patterns of entropy in nature, thousands of different droplets that appear connected but in reality are not. It’s a stroboscopic illusion, a trick of animation and high-speed lighting rather than high-speed photography.
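The timing trick behind that illusion is simple arithmetic, and fun to sketch. Assuming droplets are released at a fixed period, each flash fires a tiny bit later than the last, so every new droplet is frozen slightly further along its fall; played back, thousands of drops read as one slow one. The numbers below are illustrative, not Physalia's actual settings.

```python
# Stroboscopic stop-motion timing, sketched with assumed numbers.

DROP_PERIOD_MS = 100.0  # assumed time between droplet releases
STEP_MS = 0.5           # extra flash delay added per captured frame

def flash_delay(frame):
    """Delay (ms) after a droplet's release at which that frame's flash fires."""
    return (frame * STEP_MS) % DROP_PERIOD_MS

# Frame 0 freezes a drop at release; frame 40 freezes a *different* drop
# 20 ms into its fall, so playback shows apparent motion.
print(flash_delay(0), flash_delay(40))  # 0.0 20.0
```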
And it’s a fun DIY project, to boot. 3D and robotics? Okay, we’re in.
The Barcelona-based studio that produced it shares not only the animated result (including a logotype for design mag IdN), but also a short film explaining the making-of. Physalia’s Belén Palos writes us:
We got your contact through Alex Trochut, who shares a studio here in Barcelona with us. We are big admirers of Create Digital Motion, so we would love to share with you our new piece Entropy, a joint effort from our 3D division and robotics lab, in which we created a system to capture the fall of a water drop without a slow-mo camera – with replacement animation mapped inside the drop! There’s an Arduino involved, as well as our self-developed motion control.
Mapping: it’s kind of everything. It’s the projected image mapped to the surface. It’s pixels mapped to lights. It’s the control layout you use on your iPad and your fader box mapped to parameters in visual output. It’s the translation of music to lights. It’s the range of color on the filter. You’re constantly mapping one thing to another.
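Nearly all of those mappings boil down to the same tiny operation: rescaling a value from one range into another, the idea behind Arduino's map() or Max's [scale] object. A minimal version:

```python
# Linear range mapping - the atom of most audiovisual mappings.

def scale(x, in_lo, in_hi, out_lo, out_hi):
    """Linearly map x from [in_lo, in_hi] to [out_lo, out_hi]."""
    t = (x - in_lo) / (in_hi - in_lo)
    return out_lo + t * (out_hi - out_lo)

# e.g. a MIDI fader (0-127) driving projector opacity (0.0-1.0):
print(scale(64, 0, 127, 0.0, 1.0))  # ~0.504
```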
And of course, the community of people who read this site are generally undertaking, in one way or another, the difficult task of mapping across media, as you map visual performance to music.
So, it’s fitting that “mapping” at Mapping Festival is about more than projection mapping, as the Geneva, Switzerland audiovisual happening kicks off this week.
I want to talk a bit about musical mapping. For a panel I’m leading at Mapping on Saturday May 11, I’ve pulled together a few of the artists at the festival who deal with that issue of musical translation and image. Each of these artists works at that touch point of sound and picture. Continue reading »
In a teaser video just released by Spain’s Things Happen, a silhouetted performer uses arm position to sweep through RGB colors and trigger sound cues. It’s the latest effort to integrate the immersive media environment with a performer’s body, part spectacle, part interface.
The ingredients, apart from Microsoft’s ubiquitous Kinect depth camera:
Motion capture + image = light + sound
MadMapper [using MadMapper’s MadLight feature to trigger lighting]
Music: Sun Glitters x Isan – Snowfall
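A plausible guess at the core mapping in a setup like this: normalize the hand height reported by the depth camera, sweep it through the hue wheel, and emit RGB for the lights. The names and ranges here are my assumptions, not Things Happen's actual patch.

```python
# Hypothetical arm-position-to-RGB sweep for a depth-camera rig.

import colorsys

def arm_to_rgb(hand_y, y_min=0.0, y_max=2.0):
    """Map a tracked hand height (meters, assumed range) to 8-bit RGB."""
    t = min(1.0, max(0.0, (hand_y - y_min) / (y_max - y_min)))
    r, g, b = colorsys.hsv_to_rgb(t, 1.0, 1.0)  # full saturation and value
    return tuple(int(round(c * 255)) for c in (r, g, b))

print(arm_to_rgb(0.0))  # arm low -> red (255, 0, 0)
print(arm_to_rgb(1.0))  # mid-height -> cyan (0, 255, 255)
```

The same triple could feed projected color and a sound parameter at once, which is the whole point of fusing the two.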
The nice thing about the inter-linked, comment-enabled Web is, we get to see more.
Inverting the relationship of color to output, Things Happen uses a color turntable to trigger lights and sound. The colors themselves become a score for audiovisuals, a return to the days of optical discs and optical film tracks.
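One plausible way to read a spinning color disc as a score (an assumption about the general approach, not their actual code): sample the color under a fixed playhead, quantize its hue into bins, and fire a cue per bin.

```python
# Hypothetical color-disc "score" reader: hue bin -> cue number.

import colorsys

def hue_to_cue(r, g, b, num_cues=8):
    """Quantize an RGB sample (0-255 per channel) into one of num_cues triggers."""
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    return int(h * num_cues) % num_cues

print(hue_to_cue(255, 0, 0))  # red -> cue 0
print(hue_to_cue(0, 255, 0))  # green -> cue 2
```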
Enveloped by a sculptural projection of light, the band performed amidst a shimmering visual spectacle. Photo courtesy the artists; check out their Instagram.
Amidst a forest of illuminated tubing, the band How to Destroy Angels (with Trent Reznor, Atticus Ross, Mariqueen Maandig, and visualist/art director Rob Sheridan) takes the stage in an abstract digital structure. The stage image is transcendent, a rectangular prism of shifting digital visuals and projected effects. Perhaps it’s best to run through the specs:
16 ft. tall (nearly 5m) curtain of surgical tubing
Data visualization is moving from the macroeconomic and large-scale – census numbers and such – to the personal. And digital work is getting more physical. So, it’s telling to look at this latest interaction design project from Copenhagen-based creator Andrew Spitz. The sound designer-turned-interaction designer built an app in Max/MSP that pulls travel information – entered manually or from TripIt – and outputs graceful arcs in a 3D-printed sculpture that acts as a tangible travelogue. (I’d actually love to see it go further, perhaps showing elevation with flight tracking or something, but the simple gesture here is nice.)
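The data step such a piece implies is easy to sketch: turn an origin/destination pair into a great-circle distance, then scale that distance to an arc height that fits the printed form. This is purely illustrative Python; Andrew's actual Max patch isn't published here.

```python
# Hypothetical travel-data-to-arc pipeline for a 3D-printed travelogue.

from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def arc_height_mm(dist_km, max_mm=80.0, max_km=20000.0):
    """Scale a trip's distance to a print-bed-friendly arc height (assumed limits)."""
    return max_mm * min(1.0, dist_km / max_km)

# Copenhagen -> New York, roughly:
d = haversine_km(55.68, 12.57, 40.71, -74.01)
print(round(d), round(arc_height_mm(d), 1))
```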
Max to me is a surprising choice, but it shows how capable that environment can be for those willing to push it.
In another meeting place between digital and tangible, Andrew is using hand-stamped elements to make the UI for Flying, an iPhone app that will link to loci with tracking information. I love this sort of technique – the kind of broad perspective that reminds you that just because you’re working in digital media doesn’t mean you can’t mix in non-digital media. Watch: Continue reading »