Howdy, y’all. This is George, a composer / performer at UBC and part of this year’s Laptop Orchestra (woo!). Last week, Annie and I worked on a piece using motion and colour tracking. In perfect conditions, the motion tracker acts as a sampler (in physical space!), allowing Annie (a dancer) to trigger fade-ins and fade-outs, as well as trigger and retrigger long- or short-form audio samples. The colour tracker (again, in perfect conditions) lets Annie affect playback speed, playback pitch, and delay intensity based on the height and size of a given object – in this case, a red sock.
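For the curious, here’s a rough sketch of the kind of mapping the colour tracker performs. The function names, parameter ranges, and scaling choices below are my own illustrative assumptions, not the exact values from our patch:

```python
def map_colour_tracking(y_norm: float, area_norm: float) -> dict:
    """Map a tracked object's normalized height (0 = bottom of frame,
    1 = top) and normalized visible area (0..1) to playback parameters.
    All ranges here are illustrative assumptions, not our actual patch."""
    # Height controls playback speed: 0.5x at the bottom, 2x at the top.
    speed = 0.5 + y_norm * 1.5
    # Height also offsets pitch: -12 to +12 semitones around centre frame.
    pitch_semitones = (y_norm - 0.5) * 24
    # Object size (e.g. how much of the red sock is visible) drives the
    # delay mix, clamped to the 0..1 range.
    delay_mix = min(max(area_norm, 0.0), 1.0)
    return {"speed": speed, "pitch": pitch_semitones, "delay": delay_mix}
```

So with the sock held at mid-height and half-visible, `map_colour_tracking(0.5, 0.5)` would give normal-ish playback speed, no pitch offset, and a moderate delay mix.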
Because of the imperfections of the performance space (inconsistent lighting leading to visual noise in the colour tracker, people walking through the visual plane as we were setting up, etc.), the studio performance did not go perfectly, but the process of creating the environment was a worthwhile learning experience. In fact, working on this piece has inspired an audio-only, fixed media variation that I am currently working on.
As a fixed media composer and audio engineer, I feel out of my element in this dynamic world of “interactive” art; I spend hours automating aspects of recordings to ensure a worthwhile final product, and that just can’t be done here. However, I’m beginning to realize what needs to be done to make interactive art failsafe and foolproof while still allowing for a dynamic performance. Hopefully my remaining projects will reflect this growing grasp 😉 .
Thanks for reading,
PS If you want to check out my fixed media endeavours, head over to https://soundcloud.com/thesunlitwoodsthemoonlitsea for some mainstream AND esoteric styles