Isadora Experiments – Triggering and manipulating audio from live video
Recently I’ve had the chance to experiment with Isadora by Troikatronix, using live video to trigger and manipulate audio for a performing arts performance, with multiple sounds triggered from multiple cameras. The intention is to have either dancers, or simply members of the public at the exhibition, trigger audio through their movements.
The patch below makes use of several of the “actors” available in Isadora to route a single camera signal (video in watcher) and analyse it (eyes).
Of the many parameters you can watch using eyes, the main ones I’ve used are “hit column” and “hit row”, which simply track the main object in any given frame of the live streamed video. Correlating these against a grid gives a numbered intersection I’ve been able to use to trigger different sounds.
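Outside Isadora, that grid lookup can be sketched in Python. The cell counts and the 0–100 value range here are illustrative assumptions, not what eyes actually outputs:

```python
def grid_cell(hit_col, hit_row, cols=3, rows=3, max_val=100):
    """Map hit column/row values (assumed 0..max_val) to a single
    numbered grid intersection in the range 0 .. cols*rows - 1."""
    # Clamp the incoming values, then scale each axis to a cell index.
    col = min(max(int(hit_col / max_val * cols), 0), cols - 1)
    row = min(max(int(hit_row / max_val * rows), 0), rows - 1)
    return row * cols + col

print(grid_cell(5, 5))    # object near the top-left corner
print(grid_cell(95, 95))  # object near the bottom-right corner
```

Each intersection number can then be bound to its own sample, so a performer moving across the frame steps through the sound bank.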
However, smoothing this ever-changing series of values is essential, so as to avoid huge fluctuations in the values and, consequently, in the triggered audio.
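One common way to do this kind of smoothing is an exponential moving average; this is a hypothetical stand-in for whatever smoothing actor the patch uses, not Isadora’s own implementation:

```python
class Smoother:
    """Exponential moving average: damps frame-to-frame jitter in
    tracking values before they reach any trigger logic."""
    def __init__(self, alpha=0.2):
        self.alpha = alpha   # lower alpha = heavier smoothing
        self.value = None

    def update(self, sample):
        # Seed with the first sample, then nudge towards each new one.
        if self.value is None:
            self.value = float(sample)
        else:
            self.value += self.alpha * (sample - self.value)
        return self.value

s = Smoother(alpha=0.2)
for raw in [10, 90, 12, 88, 11]:   # jittery tracking data
    print(round(s.update(raw), 1))
```

The jittery input settles into a gently drifting value, which keeps the downstream triggers from firing on every noisy frame.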
After this section, the final stage involves setting up several “inside range” actors, each watching a separate band of values. This enables me to trigger sounds for objects that sit generally high, low, left or right in the live streamed shot. Each band feeds a separate audio file trigger, which fires when the incoming value falls between the limits set on its “inside range” actor.
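The banding logic above can be sketched as a simple dictionary of ranges; the band names and boundaries here are made up for illustration:

```python
def active_triggers(value, ranges):
    """Return the name of every band whose (low, high) range contains
    the value, mimicking a bank of 'inside range' actors."""
    return [name for name, (low, high) in ranges.items()
            if low <= value <= high]

# Illustrative bands over an assumed 0-100 horizontal position:
bands = {
    "left_sample":   (0, 33),
    "centre_sample": (34, 66),
    "right_sample":  (67, 100),
}

print(active_triggers(20, bands))
print(active_triggers(70, bands))
```

In the real patch each matching band would enable its own audio file player rather than return a name.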
Further manipulation comes from using “obj size” in eyes to control the volume of the triggered sounds via the size of the object tracked on screen. Likewise, the object’s vertical position has been used to govern the speed at which the triggered audio is played.
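Both mappings boil down to linearly rescaling one control range onto another. A minimal sketch, with input and output ranges that are my own assumptions rather than the patch’s actual settings:

```python
def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Clamp a value to an input range, then remap it linearly onto
    an output range - the usual shape of a scale/limit mapping."""
    value = max(in_lo, min(in_hi, value))
    t = (value - in_lo) / (in_hi - in_lo)
    return out_lo + t * (out_hi - out_lo)

obj_size, vert_pos = 25, 100          # hypothetical tracking readings

# Bigger tracked object -> louder sample (0.0 to 1.0 gain).
volume = scale(obj_size, 0, 50, 0.0, 1.0)

# Higher vertical position -> faster playback (0.5x to 2x speed).
speed = scale(vert_pos, 0, 100, 0.5, 2.0)

print(volume, speed)
```

Clamping matters here: a tracking glitch that reports an object far outside the expected size range should pin the volume at its limit rather than overdrive it.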
All in all, this has led to some very experimental soundscapes being used live during a performing arts event here in Birmingham. See below for a video of one of the first experiments, using a built-in webcam to provide the streaming video that triggers the samples. More developments soon!