Oskelate - Skeleton Visualiser
Welcome to Oskelate, a skeleton audio visualiser that uses Microsoft's Kinect hardware to facilitate more responsive music visualisations.

Oskelate is a skeleton music visualiser built in Pure Data (Pd) and Java. There are two core sets of functionality built into this system: elements that respond to audio input via the host computer's built-in or external microphone, and elements that respond to the performer's skeleton data via Microsoft's Xbox Kinect. By combining these two responses, Oskelate looks to provide musicians with the ability to create a unique visual component to their performance that represents not only the audio qualities of their music but also the kinetic movement of the performer on stage.
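To make the two control paths concrete, here is a minimal Java sketch of how a single module parameter might be routed to either source. The names (ParameterRouter, updateAudio, updateSkeleton) are illustrative assumptions, not taken from the actual Oskelate codebase, which is built from Pd patches.

```java
// Hypothetical sketch: one module parameter, drivable by either
// the audio level or a tracked skeleton joint.
public final class ParameterRouter {

    public enum ControlSource { AUDIO, SKELETON }

    private ControlSource source = ControlSource.AUDIO;
    private double audioLevel;     // latest microphone level, 0..1
    private double jointPosition;  // latest joint position, normalised 0..1

    public void setSource(ControlSource s)      { this.source = s; }
    public void updateAudio(double level)       { this.audioLevel = clamp(level); }
    public void updateSkeleton(double position) { this.jointPosition = clamp(position); }

    /** Value a video module reads each frame, whichever source drives it. */
    public double value() {
        return source == ControlSource.AUDIO ? audioLevel : jointPosition;
    }

    private static double clamp(double v) {
        return Math.max(0.0, Math.min(1.0, v));
    }
}
```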

By enabling a more immersive visualisation of a musician's performance, the Oskelate system aims to encourage a greater emphasis on the visual components of musical performance, without requiring specialised skill-sets or software knowledge.
Video Module: Luma Offset

The luma offset module redistributes the pixels of the source footage according to their luminosity values, based on the settings defined by the user. The spacing of the pixel distribution and the banding can be controlled by either audio or skeleton response. A fill, which closes the holes left between the spaced pixels, and smoothing can each be turned on and off.
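A rough Java sketch of the idea, assuming a simple horizontal redistribution where brighter pixels travel further; the real module is a Pd patch, and the names here (LumaOffset, spacing) are hypothetical.

```java
import java.awt.image.BufferedImage;

public final class LumaOffset {

    /** Shift each pixel horizontally in proportion to its luminance. */
    public static BufferedImage apply(BufferedImage src, double spacing) {
        int w = src.getWidth(), h = src.getHeight();
        BufferedImage out = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                int rgb = src.getRGB(x, y);
                int r = (rgb >> 16) & 0xFF, g = (rgb >> 8) & 0xFF, b = rgb & 0xFF;
                // Rec. 601 luma, normalised to 0..1.
                double luma = (0.299 * r + 0.587 * g + 0.114 * b) / 255.0;
                // Redistribute: brighter pixels travel further.
                int nx = Math.floorMod((int) (x + luma * spacing), w);
                out.setRGB(nx, y, rgb);
            }
        }
        // Pixels left uncovered stay black; this is the gap the
        // module's optional fill pass would close.
        return out;
    }
}
```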
Video Module: Refraction

The refraction module divides the source signal into cells to give an appearance similar to glass refraction. The magnitude of refraction and the height and width of these cells can be controlled by either audio or skeleton response, while image magnification can be toggled on and off.
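One plausible way to imitate glass-tile refraction in Java, assuming each cell samples the source at a fixed per-cell offset; cellW, cellH and magnitude stand in for the audio/skeleton controls, and none of these names come from the actual patch.

```java
import java.awt.image.BufferedImage;

public final class Refraction {

    /** Each cell bends the image by its own offset, like a glass tile. */
    public static BufferedImage apply(BufferedImage src, int cellW, int cellH,
                                      double magnitude) {
        int w = src.getWidth(), h = src.getHeight();
        BufferedImage out = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                // Per-cell offset: a smooth function of the cell index so
                // neighbouring cells bend the image in different directions.
                int cx = x / cellW, cy = y / cellH;
                int dx = (int) (Math.sin(cx * 1.7 + cy) * magnitude);
                int dy = (int) (Math.cos(cy * 1.3 + cx) * magnitude);
                int sx = Math.floorMod(x + dx, w);
                int sy = Math.floorMod(y + dy, h);
                out.setRGB(x, y, src.getRGB(sx, sy));
            }
        }
        return out;
    }
}
```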
Video Module: Kalei

The kalei module creates a number of segments from the centre of the source video. The number of segments (magnitude) and the rotation of the video input and output can be controlled using audio or skeleton response.
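The segment effect can be sketched with a polar-coordinate mapping: every pixel's angle is mirrored back into one wedge, so the frame repeats around the centre. This Java version is illustrative only; segments and rotation stand in for the magnitude and rotation controls.

```java
import java.awt.image.BufferedImage;

public final class Kalei {

    /** Fold the frame into identical mirrored wedges around the centre. */
    public static BufferedImage apply(BufferedImage src, int segments,
                                      double rotation) {
        int w = src.getWidth(), h = src.getHeight();
        double cx = w / 2.0, cy = h / 2.0;
        double wedge = 2 * Math.PI / segments;
        BufferedImage out = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                double dx = x - cx, dy = y - cy;
                double r = Math.hypot(dx, dy);
                double a = Math.atan2(dy, dx) + rotation;
                // Mirror the angle back and forth inside one wedge.
                a = Math.abs((a % wedge + wedge) % wedge - wedge / 2);
                int sx = clamp((int) (cx + r * Math.cos(a)), w);
                int sy = clamp((int) (cy + r * Math.sin(a)), h);
                out.setRGB(x, y, src.getRGB(sx, sy));
            }
        }
        return out;
    }

    private static int clamp(int v, int max) {
        return Math.max(0, Math.min(max - 1, v));
    }
}
```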
Texture Module: TexCube

Here the video filter is rendered inside a cube. The viewing angle and the scale of the cube can each be adjusted along the X, Y and Z axes in response to either audio or skeleton input.
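As a sketch of that mapping, each axis of rotation and scale can be fed its own 0..1 control value (an audio level or a normalised joint position). These helper names are hypothetical, not Oskelate's API.

```java
public final class TexCube {

    /** Map three 0..1 control values onto per-axis rotation in degrees. */
    public static double[] rotationDegrees(double rx, double ry, double rz) {
        return new double[] { rx * 360, ry * 360, rz * 360 };
    }

    /** Map three 0..1 control values onto per-axis scale within [min, max]. */
    public static double[] scale(double sx, double sy, double sz,
                                 double min, double max) {
        return new double[] { lerp(min, max, sx), lerp(min, max, sy),
                              lerp(min, max, sz) };
    }

    private static double lerp(double a, double b, double t) {
        return a + (b - a) * t;
    }
}
```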
Texture Module: OskWave

The video is rendered onto a surface moving along a waveform. The shape of this waveform can be selected using the graphical user interface. The force, the amount of noise and the height of the waveform can be controlled using audio or skeleton response. The images shown here are only a small sample of what can be achieved with this module.
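A minimal sketch of the displacement, assuming each point on the surface is raised by a travelling waveform plus jitter. The selectable shapes and the force, noise and height parameters mirror the controls described above, while the code itself is illustrative, not the actual patch.

```java
public final class OskWave {

    public enum Shape { SINE, TRIANGLE, SQUARE }

    /**
     * Height of one surface point at horizontal position x (in cycles).
     * "force" scales travel speed, "noise" adds jitter, "height" scales
     * the amplitude.
     */
    public static double vertexHeight(Shape shape, double x, double time,
                                      double force, double noise, double height) {
        double phase = x + time * force;       // wave travels with "force"
        double t = phase - Math.floor(phase);  // position within one cycle, 0..1
        double wave = switch (shape) {
            case SINE     -> Math.sin(2 * Math.PI * t);
            case TRIANGLE -> 1 - 4 * Math.abs(t - 0.5);
            case SQUARE   -> t < 0.5 ? 1.0 : -1.0;
        };
        double jitter = (Math.random() * 2 - 1) * noise;
        return (wave + jitter) * height;
    }
}
```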
Before you start using Oskelate, you will need to plug in and position your Kinect within the performance space. There are a few considerations to take into account when doing this, as they heavily affect the tracking abilities of the Kinect. We have not discussed variables such as lighting here, as these did not significantly impact the user-testing studies.
The ideal position for tracking our guitarist, Nathan, is directly in front of him. Try to place the device around 2 metres away from the performer. This will allow the performer to move around without the Kinect losing sight of them.
If the performer is near the front of the stage, placing the Kinect in front of them is not always possible. Placing the Kinect to the side of the performer is generally a bad idea: the tracking is far more inconsistent, although it will still work to a limited extent. Fortunately, the Kinect can track from behind, provided it is able to see the selected skeleton joints of the user. However, if you want to track the hands of the user, this may not be the best option, as the performer generally has their hands in front of them.
Once the user has set up within the performance space, it is a good idea to calibrate the performer. While this is no longer necessary for OpenNI, we recommend holding the Psi pose (shown above) to establish user tracking. To do this, the user must hold their hands up so that they are level with their head, as Nathan is doing here. The white registration dots you see projected behind him can be turned on and off using Oskelate's 'Calibration' mode.
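A rough illustration of what a Psi-pose check amounts to, assuming joint positions with a y-axis pointing up; the real tracking is handled by OpenNI, whose API is not reproduced here, and the names below are hypothetical.

```java
public final class PsiPoseCheck {

    /**
     * Crude Psi-pose test: both hands raised to roughly head height.
     * Coordinates are assumed to be in metres with y pointing up.
     */
    public static boolean isPsiPose(double headY, double leftHandY,
                                    double rightHandY, double tolerance) {
        return Math.abs(leftHandY - headY) < tolerance
            && Math.abs(rightHandY - headY) < tolerance;
    }
}
```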
Oskelate has undergone a substantial period of development to reach its current state. If you are interested in some of the processes that were followed, click on the images to check out some of the other videos produced along the way.
Pd was eventually selected for its effective signal-routing functionality. Test applications were produced from these Pure Data patches to assess how intuitive the interface was to use, how stable the application was, and how well it performed. A number of musicians were brought in during this process to help with the testing; by having them play their instruments in controlled environments, we were able to uncover the subtle eccentricities of tracking skeleton data with the Kinect. Issues we encountered included the early vertical layout of the interface, Pure Data's canvas moving unintentionally, and unintuitive node functionality. These results, along with the feedback provided during the sessions, helped sculpt our end product.
First conceptualised by Ryan Achten for Victoria University's MDDN 412 paper, this project expanded into a collaboration with Victoria University's Software Engineering (SWEN) students, Jason Pather, Joshi Shushruth and Simon Clark. Earlier iterations were produced in Processing and Pure Data (Pd), focusing primarily on gestural control using Microsoft's Kinect.
