Christopher Dye's profile

Duck Research Station (progress)

Do yourself a favor: view this full-screen and make sure your audio is on.
Objective:
Using images collected from the ACE Field Research Facility (FRF) in Duck, NC, and meteorological data from the National Data Buoy Center (NDBC), create a time-lapse panorama that uses data-driven sound to represent the meteorological conditions present between June 1, 2010 and April 30, 2011.

How:
After sorting and compiling the image files with the help of Max/MSP, images from three different cameras are composited in After Effects to create a time-lapse panorama. The panorama's 5,248 frames are constructed from 15,744 image files.
 
Finally, Max/MSP/Jitter is used to parse the data and analyze the images, which in turn triggers and generates the sounds.

See below for more specifics on how these tools were used.

Challenges:

1. Camera Sync

After downloading thousands of images, I noticed discrepancies in the number of image files between the three cameras. When I first compiled the panorama in After Effects, these discrepancies caused the triptych to fall out of sync. With more than 15,700 image files, sorting through each directory to spot the mismatches by hand would be time-consuming and nearly impossible.
Since the images are organized by camera name and month, I created a custom Max patch that goes through each camera's sub-directory and compares the image file names against the corresponding files in the other cameras' sub-directories. If a mismatch is found, the patch reports the exact location of the problem.
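The Max patch itself isn't shown, but the same directory comparison can be sketched in Python. The flat per-camera layout and the `.jpg` extension are assumptions for illustration:

```python
from pathlib import Path

def find_mismatches(cam_dirs):
    """Compare image file names across camera sub-directories.

    Returns a dict mapping each camera directory to the file names
    it holds that are missing from at least one other camera.
    """
    file_sets = {d: {p.name for p in Path(d).glob("*.jpg")} for d in cam_dirs}
    # Files present in every camera are in sync; everything else is a mismatch.
    common = set.intersection(*file_sets.values())
    return {d: sorted(names - common) for d, names in file_sets.items()}
```

Set intersection makes the check linear in the number of files, so scanning 15,000+ names is immediate compared with eyeballing three directory listings.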
2. Meteorological Data

Matching the meteorological data to the correct frame of video was the next hurdle. Using another custom Max patch, I iterated through the list of image files, which happen to be named by Year/Month/Date/Hour, and compared them to each line of data, which also begins with Year/Month/Date/Hour. Whenever there was a match, the image file name and the corresponding data were linked.
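The same timestamp-prefix matching can be sketched in Python. The exact file naming isn't given, so a ten-character `YYYYMMDDHH` prefix is an assumption here:

```python
def link_frames_to_data(image_files, data_lines, prefix_len=10):
    """Link each image file to the data line sharing its
    Year/Month/Date/Hour prefix (assumed 'YYYYMMDDHH...')."""
    # Index data lines by their timestamp prefix for O(1) lookup.
    by_stamp = {line[:prefix_len]: line for line in data_lines}
    return {f: by_stamp[f[:prefix_len]]
            for f in image_files if f[:prefix_len] in by_stamp}
```

Frames with no matching data line are simply dropped from the result, mirroring the patch's behavior of linking only on a match.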
3. Sound Representation

Finding the right balance of data, and the sounds that best represent that data, will be an ongoing process. For this version I'm using a number of sound generators in Max/MSP/Jitter.

I'm using a noise generator whose amplitude is proportional to the wave height. This generator resembles the sound of crashing waves.
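The mapping from wave height to noise amplitude can be sketched outside Max like this. The 0–6 m full-scale range is taken from the note mapping below; the sample count is arbitrary:

```python
import random

def wave_noise(wave_height_m, n_samples=4, max_height_m=6.0, seed=None):
    """White-noise samples whose amplitude scales with wave height,
    analogous to a noise~ generator: 0 m -> silence, 6 m -> full scale."""
    rng = random.Random(seed)
    # Clamp to the expected range, then normalize to a 0-1 gain.
    amp = min(max(wave_height_m, 0.0), max_height_m) / max_height_m
    return [amp * rng.uniform(-1.0, 1.0) for _ in range(n_samples)]
```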

A cycle~ object is also used to represent wave height, producing six notes that correspond to data values of 0–6 meters.

Swell period is represented through ring modulation, where the modulation rate tracks the swell period (longer period = slower modulation, shorter period = faster modulation).
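A minimal sketch of that ring modulation, assuming the modulation rate is simply the inverse of the swell period (the actual scaling in the patch isn't specified):

```python
import math

def ring_modulate(samples, swell_period_s, sample_rate=44100):
    """Ring-modulate a signal with a sine whose rate is inversely
    proportional to the swell period: longer period -> slower modulation."""
    mod_hz = 1.0 / swell_period_s   # e.g. a 10 s swell -> 0.1 Hz modulator
    return [s * math.sin(2 * math.pi * mod_hz * i / sample_rate)
            for i, s in enumerate(samples)]
```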

Finally, I'm using the overall luma of each frame to drive the amplitude of another set of generators.  This gives the piece a nice rhythmic quality and represents the length and "brightness" of each day.
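The per-frame luma analysis can be sketched as follows; the Rec. 601 weights are a standard choice, though the coefficients Jitter uses internally are an assumption here:

```python
def mean_luma(rgb_pixels):
    """Average Rec. 601 luma of a frame's pixels. Normalized, this value
    could drive a generator's amplitude: bright day -> loud, night -> quiet."""
    lumas = [0.299 * r + 0.587 * g + 0.114 * b for r, g, b in rgb_pixels]
    return sum(lumas) / len(lumas)
```

Because luma rises and falls with each day, amplitude driven this way naturally produces the daily rhythm described above.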