Welcome to Oskelate, a skeleton audio visualiser that uses Microsoft's Kinect hardware to enable more responsive music visualisations.
Oskelate is a skeleton music visualiser built in Pure Data (Pd) and Java. The system provides two core sets of functionality: elements that respond to audio input via the host computer's built-in or an external microphone, and elements that respond to the performer's skeleton data via Microsoft's Xbox Kinect. By combining these two responses, Oskelate aims to give musicians the ability to create a unique visual component to their performance, one that represents not only the audio qualities of their music but also the performer's kinetic movement on stage.
By enabling a more immersive visualisation of a musician's performance, the Oskelate system aims to encourage a greater emphasis on the visual components of musical performance, without requiring specialised skill-sets or software knowledge.
Before you start using Oskelate, you will need to plug in and position your Kinect within the performance space. There are a few considerations to take into account when doing this, as they heavily affect the Kinect's tracking abilities. We have not discussed variables such as lighting here, as these did not significantly affect our user-testing studies.
The ideal position for tracking our guitarist Nathan is directly in front of him. Aim to place the device around 2 metres away from the performer; this allows the performer to move around without the Kinect losing sight of them.
If the performer is near the front of the stage, placing the Kinect in front of them is not always possible. Placing the Kinect to the side of the performer is generally not a good idea: the tracking is far less consistent, although it will still work to a limited extent. Fortunately, the Kinect can track from behind, provided it is able to see the selected skeleton joints. However, if you want to track the performer's hands, this may not be the best option, as performers generally hold their hands in front of them.
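As a rough illustration of how the 2-metre guideline could be checked in software (this is a sketch, not the actual Oskelate implementation), the snippet below tests whether a tracked joint sits inside a recommended depth band. OpenNI reports real-world joint positions in millimetres, with Z increasing away from the sensor; the band limits and method names here are our own assumptions.

```java
// Hypothetical helper: decide whether a tracked performer stands in the
// recommended distance band in front of the Kinect.  OpenNI real-world
// coordinates are in millimetres, with Z pointing away from the sensor,
// so roughly 2 m corresponds to a Z value around 2000.
public class KinectRange {
    // Assumed band for illustration: 1.5 m to 3 m.
    static final float MIN_Z = 1500f;
    static final float MAX_Z = 3000f;

    /** True when the torso joint's Z depth falls inside the band. */
    static boolean inOptimalRange(float torsoZ) {
        return torsoZ >= MIN_Z && torsoZ <= MAX_Z;
    }

    public static void main(String[] args) {
        System.out.println(inOptimalRange(2000f)); // about 2 m away: true
        System.out.println(inOptimalRange(800f));  // too close: false
    }
}
```

A check like this could drive an on-screen warning during setup, before the performance begins.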
Once the user has set up within the performance space, it is a good idea to calibrate the performer. While this is no longer necessary for OpenNI, we recommend holding the Psi pose (shown above) to establish user tracking. To do this, the user must hold their hands up so that they are level with their head, as Nathan is doing right now. The white registration dots you see projected behind him can be turned on and off using Oskelate's 'Calibration' mode.
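To make the pose concrete, here is a hedged sketch of how a Psi-pose check might look, assuming joint positions in OpenNI real-world coordinates (millimetres, with Y increasing upwards). The class name, tolerance value, and the idea of checking hand height against head height are our own illustration, not code from the Oskelate patch.

```java
// Hypothetical Psi-pose detector: both hands should be roughly level
// with the head.  Coordinates are assumed to be OpenNI real-world
// values in millimetres, Y increasing upwards.
public class PsiPose {
    // Assumed tolerance: how far (mm) a hand may sit from head height.
    static final float TOLERANCE = 200f;

    /** True when both hands are approximately level with the head. */
    static boolean isPsiPose(float headY, float leftHandY, float rightHandY) {
        return Math.abs(leftHandY - headY) <= TOLERANCE
            && Math.abs(rightHandY - headY) <= TOLERANCE;
    }

    public static void main(String[] args) {
        System.out.println(isPsiPose(1600f, 1550f, 1650f)); // hands near head height: true
        System.out.println(isPsiPose(1600f, 900f, 950f));   // hands at the sides: false
    }
}
```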
Oskelate has had a substantial period of development to reach its current state. If you are interested in some of the processes that were followed, click on the images to check out some of the other videos produced along the way.
Pd was eventually selected for its effective signal-routing functionality. Test applications were produced from these Pure Data patches to assess how intuitive the interface is to use and how stable the application is, as well as its performance parameters. A number of musicians were brought in during this process to help with the testing; by playing their instruments in controlled environments, they helped us uncover the subtle eccentricities of tracking skeleton data with the Kinect. Issues we encountered included the early vertical layout of the interface, Pure Data's canvas moving unintentionally, and the node functionality being unintuitive. These results, along with the feedback provided during the sessions, helped sculpt our end product.
First conceptualised by Ryan Achten for Victoria University's MDDN 412 paper, this project expanded into a collaboration with Victoria University's Software Engineering (SWEN) students: Jason Pather, Joshi Shushruth and Simon Clark. Earlier iterations were produced in Processing and Pure Data (Pd), focusing primarily on gestural control using Microsoft's Kinect.