By Michael Cragg and Colin Roache

Project Syncmaster

For video editors and motion designers, aligning video and animations to audio can be a tedious, manual process. Syncmaster leverages Adobe Research's audio analysis technology to make audio data actionable, ultimately enabling editors and designers to be more expressive and efficient.

This project was shown at Adobe Max Sneaks 2017. While this prototype is not yet part of Creative Cloud, many Sneaks from previous years have later been incorporated into products. We’d love to get your thoughts and feedback on Syncmaster. 
One of the first things the Syncmaster team worked on was a refreshed visualization of the waveform. The waveform is split into three bands: low, mid, and high frequencies. With the new waveform, it's possible to pick out individual instruments or notes in the music that you couldn't see before.
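The filters behind the prototype's waveform aren't public, but the core idea of a three-band view can be sketched by measuring per-frame energy in low, mid, and high frequency ranges. Below is a minimal Python/NumPy sketch; the crossover frequencies (250 Hz and 4 kHz) are illustrative assumptions, not values from Syncmaster:

```python
import numpy as np

def band_energies(samples, sample_rate, frame_size=1024,
                  low_cut=250.0, high_cut=4000.0):
    """Per-frame signal energy in low / mid / high frequency bands.

    The crossover frequencies are illustrative guesses, not values
    from the Syncmaster prototype.
    """
    freqs = np.fft.rfftfreq(frame_size, d=1.0 / sample_rate)
    low = freqs < low_cut
    mid = (freqs >= low_cut) & (freqs < high_cut)
    high = freqs >= high_cut

    bands = []
    for start in range(0, len(samples) - frame_size + 1, frame_size):
        # Power spectrum of one frame, then summed per band.
        spectrum = np.abs(np.fft.rfft(samples[start:start + frame_size])) ** 2
        bands.append((spectrum[low].sum(), spectrum[mid].sum(),
                      spectrum[high].sum()))
    return bands

# A 110 Hz tone should land almost entirely in the low band.
sr = 22050
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 110.0 * t)
low_e, mid_e, high_e = band_energies(tone, sr)[0]
```

Plotting the three band-energy curves stacked over time gives the kind of layered waveform where a bass line and a hi-hat occupy visibly different lanes.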
The expanded audio timeline automatically detects beats, events, and amplitude. From here, editors can select which of these elements they want to sync their edits to.
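Syncmaster's detector itself isn't public, but a toy version of beat/event detection can be built from frame energy alone: flag frames whose energy jumps well above the recent average. A minimal sketch, assuming a simple energy-flux heuristic rather than the prototype's actual analysis:

```python
import numpy as np

def detect_onsets(samples, frame_size=512, threshold=1.5):
    """Return sample positions where frame energy spikes above the
    running average — a toy stand-in for beat/event detection."""
    n_frames = len(samples) // frame_size
    frames = samples[:n_frames * frame_size].reshape(n_frames, frame_size)
    energy = (frames ** 2).sum(axis=1)
    onsets = []
    for i in range(1, n_frames):
        local_avg = energy[max(0, i - 8):i].mean()
        # Require both a jump over the local average and a rising edge.
        if energy[i] > threshold * local_avg and energy[i] > energy[i - 1]:
            onsets.append(i * frame_size)
    return onsets

# Three loud clicks in a quiet noise floor, one per 256-sample burst.
rng = np.random.default_rng(0)
audio = 0.01 * rng.standard_normal(32000)
for pos in (7680, 15360, 23040):
    audio[pos:pos + 256] += 0.8
onsets = detect_onsets(audio)
```

Each detected onset position could then be surfaced on the timeline as a snappable marker, which is essentially what "edit to the beat" workflows need.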
Syncmaster automatically finds and visualizes the unique and important parts in the music (such as notes or chords) using AI.
We found that audio data is a key driving force in editing and syncing videos and motion graphics: people want to use this data to choreograph their visuals with an audio track's tempo, composition, and dynamics.
Example of binding an animation to the audio's amplitude over time, using dynamic keyframe ranges that stay synchronized to the audio data.
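As a rough illustration of this amplitude binding, the sketch below computes a per-video-frame RMS envelope and maps it onto an animated property range. The keyframe format and the 0.5x–1.5x scale range are hypothetical choices for illustration, not Syncmaster's actual binding model:

```python
import numpy as np

def amplitude_keyframes(samples, sample_rate, fps=30,
                        value_min=0.5, value_max=1.5):
    """Map an audio amplitude envelope onto per-frame keyframes.

    Returns (time_seconds, property_value) pairs, one per video frame.
    The property range is an arbitrary example (a 0.5x-1.5x scale).
    """
    hop = sample_rate // fps  # audio samples per video frame
    n_frames = len(samples) // hop
    frames = samples[:n_frames * hop].reshape(n_frames, hop)
    rms = np.sqrt((frames ** 2).mean(axis=1))
    peak = max(rms.max(), 1e-12)  # avoid dividing by zero on silence
    return [(i / fps, value_min + (value_max - value_min) * (r / peak))
            for i, r in enumerate(rms)]

# A tone that fades in drives the scale from ~0.5x up to 1.5x.
sr = 24000
t = np.arange(sr) / sr
fading = t * np.sin(2 * np.pi * 220.0 * t)
keys = amplitude_keyframes(fading, sr)
```

Sampling the envelope at the video frame rate keeps the keyframes aligned with render time, so a louder passage in the track directly produces a larger on-screen value.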