Motion Tracking: David Rokeby’s “Very Nervous System”

October 8, 2008 at 11:12 am (Further Research & Contextualisation)

I wanted to look further into the possibilities of translating movement into sound using motion tracking via a video camera. I came across this work by interactive artist David Rokeby:

 I’m actually pretty blown away by the amount of gestural control he seems to have over the sound! It appears as though the music has some kind of linearity that only occurs when movement is detected. Individual sounds (or clusters of sounds) seem to be positioned at different points in the space and are triggered when the body moves into that region; hence (in the second video) when he repeats movements in a certain area, the same sound occurs. The velocity of the movement seems to determine the speed at which the sequence of sounds is played through – notice how his big sweeping motions produce correspondingly large gestural sweeps in the sound. However, I may be (completely) wrong! I will research further into this.
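To make my guess concrete before I test it in MaxMSP/Jitter, here is a minimal Python sketch of how the behaviour I *think* I’m seeing could work: the camera image is divided into a grid of zones, frame differencing detects motion in each zone, and each zone maps to a sound, with the amount of change acting as a rough proxy for velocity. The zone layout, threshold, and sound names are all my own invented assumptions, not anything from Rokeby.

```python
import numpy as np

def detect_motion_zones(prev_frame, frame, grid=(4, 4), threshold=30):
    """Split the camera image into a grid of zones and report which zones
    contain motion, estimated by simple frame differencing."""
    diff = np.abs(frame.astype(int) - prev_frame.astype(int))
    h, w = diff.shape
    zh, zw = h // grid[0], w // grid[1]
    active = []
    for r in range(grid[0]):
        for c in range(grid[1]):
            cell = diff[r * zh:(r + 1) * zh, c * zw:(c + 1) * zw]
            amount = cell.mean()  # average pixel change in this zone
            if amount > threshold:
                active.append(((r, c), amount))
    return active

# Hypothetical mapping: each zone triggers a sound; the amount of change
# (a crude stand-in for movement velocity) could scale playback speed.
ZONE_SOUNDS = {(0, 0): "bell", (3, 3): "drum"}

def sounds_for(active):
    return [(ZONE_SOUNDS[z], amt) for z, amt in active if z in ZONE_SOUNDS]
```

So a body entering the top-left of the frame would trigger the “bell” zone, and moving faster through it would raise the change value and so the playback speed. Crude, but it captures the repeatable space-to-sound mapping I thought I was hearing.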

 *****************************

I was wrong (in part)! As Rokeby puts it:

“The installation is a complex but quick feedback loop. The feedback is not simply ‘negative’ or ‘positive’, inhibitory or reinforcing; the loop is subject to constant transformation as the elements, human and computer, change in response to each other. The two interpenetrate, until the notion of control is lost and the relationship becomes encounter and involvement.” (http://homepage.mac.com/davidrokeby/vns.html)

Rokeby is describing a kind of intelligent system that alters its behaviour the more the user interacts with it. I would like to find out in more detail how he implemented this feedback mechanism. What amazes me is that this piece was created in 1986. The man is obviously a genius! Seeing this is really encouraging, as it highlights the possibility of me achieving (with enough work) the translation of movement through a space into sonic experience. I must research this further through practical work in MaxMSP and Jitter. I’m excited!
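To help myself think about what such a feedback loop might even mean in code, here is a toy sketch (entirely my own guess, not Rokeby’s actual mechanism): a responder whose sensitivity drifts as a function of the user’s recent activity, so the same gesture yields a different output as the interaction history accumulates – control gives way to something more like a dialogue.

```python
class AdaptiveResponder:
    """Toy feedback loop: output depends on current motion AND on how the
    system has adapted to past motion. All constants are invented."""

    def __init__(self, sensitivity=1.0, adapt_rate=0.1):
        self.sensitivity = sensitivity
        self.adapt_rate = adapt_rate

    def respond(self, motion_amount):
        # motion_amount in [0, 1]; output scales with current sensitivity
        output = motion_amount * self.sensitivity
        # The system habituates: sustained activity lowers its sensitivity,
        # stillness lets it recover, so no response is ever quite repeatable.
        self.sensitivity += self.adapt_rate * (0.5 - motion_amount)
        self.sensitivity = max(0.1, min(2.0, self.sensitivity))
        return output
```

Repeating the identical gesture produces a gradually changing response, which feels like a (very crude) analogue of the “constant transformation” Rokeby describes.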

The theoretical concerns regarding user interaction are certainly interesting to me. I plan to research this further as I get deeper into motion tracking. Through my practical work, I’ve decided that I want my interface to be hidden and intuitive like Rokeby’s. However, this is not completely set in stone. As I begin to unlock the possibilities of motion tracking and MaxMSP and Jitter, I may discover another solution.

In Very Nervous System, Rokeby’s reasoning for creating a dynamic, hidden and intuitive interface stems from a need to counteract the language of the computer:

“Because the computer is purely logical, the language of interaction should strive to be intuitive. Because the computer removes you from your body, the body should be strongly engaged. Because the computer’s activity takes place on the tiny playing fields of integrated circuits, the encounter with the computer should take place in human-scaled physical space. Because the computer is objective and disinterested, the experience should be intimate.” (http://homepage.mac.com/davidrokeby/vns.html)
