Written by: Eli McIlveen
The Cambridge Arts Festival is a single-day event where emerging and established artists present their work to the community. As well as local arts & crafts and a big music & dance lineup, the event has taken on a DIY and “Maker” bent in recent years. We were asked to develop a series of interactive exhibits for the children's area, and we jumped at the chance to take part.

Conceived and developed by Eli McIlveen, The Well uses a Microsoft Kinect motion controller to project a 3D image of a visitor, allowing them to push and pull a virtual surface simply by moving around in front of the screen. The visuals are accompanied by synthesized sound under MIDI control. Dave Addison created the background visuals and lent his knowledge of the MIDI protocol.

The exhibit grew out of a 3D model of a vibrating drumhead, itself an extension of an earlier simulation we did of string motion in musical instruments. The nodes in The Well's model are laid out on a square grid; those that fall outside the diameter of the disc are simply ignored.
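For the curious, here's a minimal sketch of that kind of membrane model in Processing. It follows the description above, but every name and constant is illustrative, not the exhibit's actual code.

int N = 85;                       // nodes per side (the Well's diameter)
float[][] pos = new float[N][N];  // displacement of each node
float[][] vel = new float[N][N];  // velocity of each node
boolean[][] inDisc = new boolean[N][N];

void setup() {
  size(720, 720, P3D);
  float r = (N - 1) / 2.0;
  for (int y = 0; y < N; y++)
    for (int x = 0; x < N; x++)
      inDisc[x][y] = dist(x, y, r, r) <= r;  // mask off nodes outside the disc
}

void draw() {
  float k = 0.12, damping = 0.98;  // stiffness and damping, tuned by eye
  for (int y = 1; y < N - 1; y++) {
    for (int x = 1; x < N - 1; x++) {
      if (!inDisc[x][y]) continue;
      // each node accelerates toward the average of its four neighbours
      float avg = (pos[x-1][y] + pos[x+1][y] + pos[x][y-1] + pos[x][y+1]) / 4.0;
      vel[x][y] = (vel[x][y] + k * (avg - pos[x][y])) * damping;
    }
  }
  background(0);
  stroke(255);
  translate(width / 2, height / 2);
  for (int y = 0; y < N; y++) {
    for (int x = 0; x < N; x++) {
      if (!inDisc[x][y]) continue;
      pos[x][y] += vel[x][y];
      point((x - N / 2) * 8, (y - N / 2) * 8, pos[x][y]);  // crude point render
    }
  }
}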
 
Integrating Kinect depth data: Once we chose the Kinect as our method of interaction, the biggest challenge was integrating its depth data into the simulation.
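As an illustration of what that integration involves, here's a sketch that samples the Kinect's 640×480 depth map down to the node grid. It assumes Daniel Shiffman's Open Kinect for Processing library and a first-generation Kinect; the library choice and the depth ranges are assumptions for the example, not necessarily what the exhibit used.

import org.openkinect.processing.*;  // Open Kinect for Processing (assumed library)

Kinect kinect;

void setup() {
  size(720, 720, P3D);
  kinect = new Kinect(this);
  kinect.initDepth();
}

// Sample the 640x480 raw depth map down to an n-by-n grid of target
// displacements: the nearer the visitor, the deeper the push.
float[][] depthTargets(int n) {
  int[] depth = kinect.getRawDepth();   // raw readings, roughly 0-2047
  float[][] target = new float[n][n];
  for (int gy = 0; gy < n; gy++) {
    for (int gx = 0; gx < n; gx++) {
      int d = depth[(gy * 480 / n) * 640 + (gx * 640 / n)];
      target[gx][gy] = map(d, 400, 1500, 60, 0);  // illustrative range mapping
    }
  }
  return target;
}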
 
Next, we had to tame the input to prevent the often-drastic changes in depth readings from throwing the surface into chaos. To do this, we damped down the motion of the nodes, put an upper bound on their velocity, and added a factor to make the surface “adhere” to the user.
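Building on the grid sketch above, one way those last two measures might look in code (the damping already lives in the relaxation step); MAX_VEL and ADHESION are invented constants:

float MAX_VEL = 4.0;    // upper bound on node velocity (illustrative)
float ADHESION = 0.15;  // fraction of the way each node moves toward the reading

void applyInput(float[][] target) {
  for (int y = 0; y < N; y++) {
    for (int x = 0; x < N; x++) {
      if (!inDisc[x][y]) continue;
      // clamp velocity so a wild depth reading can't fling the node
      vel[x][y] = constrain(vel[x][y], -MAX_VEL, MAX_VEL);
      // blend toward the depth target so the surface "adheres" to the user
      pos[x][y] = lerp(pos[x][y], target[x][y], ADHESION);
    }
  }
}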
 
Sound: To further enliven the experience, we wanted to add generated sound. Synthesizing audio on the fly in Processing proved too slow, and simple playback of pre-generated files wasn't suited to the constant, continuously evolving sound we were after. Instead, we turned to MIDI, allowing Processing to control external synths.
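Sending MIDI from Processing can be as simple as the sketch below. It assumes The MidiBus, one freely available MIDI library for Processing; the port name, channel, and controller numbers are placeholders, not the exhibit's actual setup.

import themidibus.*;  // The MidiBus: a free MIDI library for Processing

MidiBus bus;

void setup() {
  // "-1" = no MIDI input; "Well Synth" is a placeholder output port name
  bus = new MidiBus(this, -1, "Well Synth");
}

// Map overall surface activity (e.g. summed node speed) onto a MIDI
// controller so the synth swells as visitors move. CC 1 = mod wheel.
void sendActivity(float activity) {
  int value = (int) constrain(map(activity, 0, 50, 0, 127), 0, 127);
  bus.sendControllerChange(0, 1, value);  // channel 0, controller 1
}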
 
We chose the REAPER digital audio workstation for its low cost and low processor overhead, running TyrellN6, a freely available analog-modelling synth plugin. Custom patches, combined with reverb and comb filtering, produce a “wind tunnel” sound.
 
Finally, to animate the exhibit during idle periods, we also introduced a subtle “water drip” that perturbs a random point on the surface from time to time. An echoing drip sound, created using filtered noise and reverb, helps to “sell” the effect.
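In sketch form, the drip amounts to an occasional impulse at a random node of the grid from the earlier sketches; the timings and impulse size here are invented for illustration.

int nextDrip = 0;  // time of the next drip, in milliseconds

void maybeDrip() {
  if (millis() < nextDrip) return;
  int x = (int) random(N), y = (int) random(N);
  if (!inDisc[x][y]) return;              // landed off the disc; retry next frame
  vel[x][y] -= random(2, 6);              // a small downward kick
  nextDrip = millis() + (int) random(3000, 8000);  // next drip in 3-8 seconds
}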
 
How it went: Kids gravitated to the Well immediately (it helped that we positioned it close to the floor so even our youngest visitors could take a turn).
 
Challenges: The inefficiency of Processing code turned out to be a major bottleneck. As a result, the simulation had to be run at a lower resolution than we had hoped, both in terms of screen size (720p) and the detail of the membrane. This version of the Well is 85 nodes in diameter, using only a fraction of the data available from the Kinect. To improve the efficiency of the membrane model, we tried an old trick used in programming cellular automata, using a one-dimensional array rather than the more straightforward 2D array. This required rewriting a good deal of the model’s code, but unfortunately didn't up our frame rates by much.
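The trick itself is simple: store the grid in one flat array and compute each node's index as y * N + x, so the inner loop walks memory sequentially and finds neighbours at fixed offsets. A sketch, using the same invented names as above:

float[] posFlat = new float[N * N];  // node (x, y) lives at index y * N + x
float[] velFlat = new float[N * N];

void relaxFlat(float k, float damping) {
  for (int y = 1; y < N - 1; y++) {
    int row = y * N;
    for (int x = 1; x < N - 1; x++) {
      int i = row + x;
      // neighbours sit at fixed offsets: ±1 horizontally, ±N vertically
      float avg = (posFlat[i-1] + posFlat[i+1] + posFlat[i-N] + posFlat[i+N]) / 4.0;
      velFlat[i] = (velFlat[i] + k * (avg - posFlat[i])) * damping;
    }
  }
}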
 
For our next project of this type we're planning to go native, most likely using C++ with a library such as Cinder or OpenFrameworks.
 
Many thanks to the Cambridge Arts Festival volunteers who helped us with setup and teardown, and to CAF’s organizer, the indefatigable Gareth Carr, for inviting us to take part.

 
Photographs from the Cambridge Arts Festival 2013, taken by Matt Mollon.