“…but I do have a Microsoft Kinect, MacBook Pro, Ableton Live 9, Max/MSP, Kontakt 5, Synapse and an eZ430-Chronos watch…”
Kyran and me (Brian) testing out our virtual cello and dynamic backing track. This piece uses the Microsoft Kinect to track various aspects of Kyran’s position and motion within the performance space. Kyran also wears an eZ430-Chronos watch, which transmits data about his hand position and motion. This information is combined in Max/MSP to create a ‘virtual cello’: the left-hand position determines the pitch to be played, while motion in the right hand actually triggers the sound. Kyran can also change the performance mode of the instrument (legato, pizz, and trem) by raising his left hand over his head.
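To give a feel for the mapping, here is a minimal Python sketch of the idea. All of the names, ranges, and thresholds below are illustrative assumptions, not our actual Max/MSP patch: left-hand height is quantized to a pitch, a burst of right-hand motion acts as the ‘bow’ that fires the note, and the hand-over-head gesture flips modes.

```python
# Hypothetical sketch of the virtual-cello mapping (illustrative values,
# not the real Max/MSP patch).

CELLO_RANGE = (36, 76)  # MIDI note numbers, roughly a cello's range

def hand_to_pitch(left_hand_y, space_min=0.0, space_max=2.0):
    """Quantize left-hand height (metres, assumed range) to a MIDI note."""
    lo, hi = CELLO_RANGE
    frac = (left_hand_y - space_min) / (space_max - space_min)
    frac = min(max(frac, 0.0), 1.0)  # clamp to the tracked space
    return lo + round(frac * (hi - lo))

def should_trigger(right_hand_speed, threshold=0.5):
    """Fire a note only when right-hand motion exceeds a 'bowing' threshold."""
    return right_hand_speed > threshold

def mode_for(left_hand_y, head_y):
    """Left hand above the head switches performance mode (legato/pizz/trem)."""
    return "switch" if left_hand_y > head_y else "play"
```

So a hand at mid-height with a fast bowing motion would play the middle of the range, while a slow drift of the right hand plays nothing at all.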
In conjunction, the Kinect tracks Kyran’s position in the performance space and uses it to alter a backing track. For this we use the ‘Nodes’ interface in Max/MSP to place different sound files/objects in the space. Each object is given a radius, and when Kyran moves inside that radius the sound file becomes active and starts playing back. The closer he gets, the louder it plays, so he can build a dynamic backing track simply by moving through the performance space.
All of this information is collected and routed in Max/MSP and then networked to Ableton via OSC messaging. This lets us use higher-end sample libraries for the sounds of each instrument; for example, our virtual cello is Native Instruments’ ‘Session Strings’ Solo Cello, loaded into Kontakt 5.
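For the curious, an OSC message is just a small UDP-friendly byte packet: a 4-byte-padded address string, a type-tag string, then big-endian arguments. Here is a minimal, stdlib-only Python sketch of that encoding (the address `/cello/note` and port are made-up examples; in practice a library like python-osc, or Max/MSP’s own udpsend, handles this for you).

```python
import struct

def osc_message(address, *args):
    """Encode a minimal OSC message supporting int, float, and string args."""
    def pad(b):
        # OSC strings are null-terminated and padded to 4-byte boundaries.
        return b + b"\x00" * (4 - len(b) % 4)
    tags, payload = ",", b""
    for a in args:
        if isinstance(a, int):
            tags += "i"; payload += struct.pack(">i", a)   # big-endian int32
        elif isinstance(a, float):
            tags += "f"; payload += struct.pack(">f", a)   # big-endian float32
        else:
            tags += "s"; payload += pad(str(a).encode())
    return pad(address.encode()) + pad(tags.encode()) + payload
```

Sending it is then one `socket.sendto` call with the packet bytes, pointed at whatever port the Ableton-side OSC receiver is listening on.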
I think that’s everything… for now.