Thanks for tuning in! For this upcoming project, Kyran and I (Brian) will be venturing into the wonderful world of video tracking. Our goal is to create an interactive virtual instrument in which a single user will be able to play an ‘Imaginary Instrument’ as well as control an ‘Imaginary Orchestra’ using video tracking.
This will consist of three components:
1) The Imaginary Cello
This will consist of two components: a way of interpreting pitch, and a velocity control. Interpreting pitch will come in the form of imitating the way a normal cello is played, where moving up and down the neck raises or lowers the resulting pitch. Our Imaginary Cello won’t have a fingerboard; instead, we will be measuring the real-time distance between Kyran’s left hand and the centre of his body. As his left hand moves up or down in space, this distance will change, and the distance can then be graduated and mapped to a pitch set.
Naturally, we do not want the notes playing constantly, so we will need a system for measuring and gating velocity. For this, we will be using the eZ430 Chronos, a watch-like device that can output its yaw, pitch and roll orientation. The watch can then be worn on the right hand and used to imitate the bow.
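One simple way to gate notes on bowing motion is to look at how much the watch’s orientation changes between frames. The sketch below assumes readings arrive as (yaw, pitch, roll) tuples; the threshold and scaling constants are placeholders, not measured values.

```python
# Degrees of total orientation change per frame below which no note sounds.
# This threshold is an assumption and would be tuned by hand.
GATE_THRESHOLD = 5.0

def bow_velocity(prev, curr):
    """Return a MIDI-style velocity (0-127), gated on bowing motion."""
    # Speed of the bowing gesture: total orientation change this frame
    speed = sum(abs(c - p) for p, c in zip(prev, curr))
    if speed < GATE_THRESHOLD:
        return 0  # the hand is still: gate closed, no note
    # Scale the remaining speed into the MIDI velocity range
    return min(127, int((speed - GATE_THRESHOLD) * 4))
```

A still right hand produces silence, a slow bow stroke a quiet note, and a fast stroke pins the velocity at the MIDI maximum.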
2) The Imaginary Orchestra
The Imaginary Cello will require only relative tracking information, meaning that (for example) when the Kinect is measuring Kyran’s left hand, it is doing so in relation to his body, not the performance space. This means he can stand and move anywhere in the space and the ‘Cello’ will work the same. This then frees us up to consider another parameter for control: Kyran’s actual relationship to the whole space. The intent here is to create an Imaginary Orchestra in which the total performance space is divided into a series of blocks, with various instrumental tracks connected to each. As Kyran moves through the performance space, the accompaniment will change dynamically around him.
This will again be done with the Kinect; this time, however, the system will measure his position within the total space. By using a series of zmap objects in Max/MSP, we can take the total range of data received from the Kinect and parse it into segments. Because the Kinect transmits X, Y and Z tracking axes, we will be able to map different samples and effects across both the width and depth of the performance space. This parsed data will then be sent via OSC messages into an Ableton Live set and mapped to the various tracks.
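To make the zmap-style parsing concrete, here is a rough Python sketch of the same idea: scale the raw X/Z position into a grid of blocks, then derive an address for the track tied to that block. The ranges, grid size, and OSC address scheme are all assumptions for illustration, not our actual Live set mapping.

```python
# Assumed extents of the performance space, in metres
X_RANGE = (-2.0, 2.0)   # width, across the Kinect's view
Z_RANGE = (1.0, 5.0)    # depth, distance away from the Kinect
COLS, ROWS = 3, 2       # a 3x2 grid gives six trigger regions

def position_to_block(x, z):
    """Return the (col, row) block the performer is standing in."""
    # Normalise each axis to 0..1, as a zmap object would
    nx = (min(max(x, X_RANGE[0]), X_RANGE[1]) - X_RANGE[0]) / (X_RANGE[1] - X_RANGE[0])
    nz = (min(max(z, Z_RANGE[0]), Z_RANGE[1]) - Z_RANGE[0]) / (Z_RANGE[1] - Z_RANGE[0])
    # Quantise the normalised position into grid coordinates
    col = min(int(nx * COLS), COLS - 1)
    row = min(int(nz * ROWS), ROWS - 1)
    return col, row

def block_to_osc(col, row):
    """Build a hypothetical OSC address for the track tied to this block."""
    return f"/live/track/{row * COLS + col}"
```

Standing in the centre of the space would resolve to the middle block, and each step across or away from the Kinect would hand control to a neighbouring region’s tracks.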
3) Gesture Based Controls
In order to make the whole system dynamic, we will need a way of triggering mode changes. In the context of this project, a mode change would be something like making the Cello active or inactive, or, in a similar vein, making any region used for triggering samples active or inactive. This will allow the entire system to be dynamic and variable, and to exist in and transition between multiple states. The plan is to create a simple gesture-recognition system that will allow Kyran to trigger different modes and effects. The easiest way to do this will be to detect and register hand crossings, combined with a sequential logic gating system: essentially, detecting when the hands cross, and then recognising a choreographed sequence of hand crossings. Using a sequence will stop this from happening by accident, and it also means we can create multiple sequences to trigger different parameter changes.
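The sequential gating idea can be sketched as a tiny state machine: a mode change only fires when a choreographed sequence of crossing events is seen in order, and any wrong event sends the recogniser back to the start. The event names and the example sequence below are placeholders for illustration.

```python
class SequenceGate:
    """Recognise a fixed, choreographed sequence of gesture events."""

    def __init__(self, sequence):
        self.sequence = sequence  # e.g. ["cross", "uncross", "cross"]
        self.index = 0            # how far through the sequence we are

    def feed(self, event):
        """Feed one detected event; return True when the full sequence completes."""
        if event == self.sequence[self.index]:
            self.index += 1
            if self.index == len(self.sequence):
                self.index = 0    # reset, ready for the next trigger
                return True
        else:
            # Wrong event: restart, but note the mismatched event may
            # itself be the beginning of a fresh attempt at the sequence
            self.index = 1 if event == self.sequence[0] else 0
        return False
```

Multiple gates with different sequences can run in parallel on the same event stream, each tied to its own mode change, which is exactly what makes accidental triggering unlikely.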
So far things are very much in the development phase, but stay tuned! We’ll be updating this feed as this project progresses.