Smell the Glove

How to wire up a glove, some machine vision software, and five strips of velcro to make a free-space musical synthesizer

Smell the Glove - Machine Vision - Invisible Synthesizer

For quite a while, I've been playing around with Processing - a multimedia programming tool which gets more and more mature as time goes on. Yesterday, I finished a Processing prototype of a system I've been thinking about, on and off, for more than a year.

I Like Blue

Bowerbird - Inspiration for 'I Like Blue'

It began with a simple idea for an emotional robot archivist called 'I Like Blue'. The robot would simply watch the world go by, but with a preference for certain kinds of things, collecting them like reminiscences and replaying them absent-mindedly to itself. I imagined it as a video version of a bowerbird, which collects shiny things it likes and makes an enticing little house out of them. In the simplest case, it might just like a particular color, and might therefore collect video fragments of a person in a blue hat, a blue car going past, or a piece of sky it particularly enjoyed. Visitors could look to see what the robot had liked that day, and maybe even bring it things it might like, to see if they could make it happy.

The Robot Orchestra

Curiosity Collective (TM) hardware! Thanks to Wildman

I'm a member of the Curiosity Collective which, thanks to a great plan from Wildman of Ipswich and his amazing circuit design skills, is collectively building a 'robot orchestra'. I therefore got to thinking whether the simple color-matching or image-matching behaviour considered for 'I Like Blue' might be used instead to generate MIDI signals which could drive the Robot Orchestra. After a good brainstorm at the last Curiosity meeting, we came up with something I could prototype.

Smell the Glove

So I spent an evening painting little velcro panels in different hues, distributed over the color spectrum, and attached them to the back of a pair of black gloves. I also wrote some software which counts up the pixels from a web camera matching each of those hues. The result is a kind of free-space virtual synthesizer, where you can play notes and chords in a pentatonic scale by lowering individual fingers, as if you were playing an invisible keyboard.

You begin with your hand parallel to the web camera, which views the tips of your fingers head on. The more you press a finger down, the more of the color panel stuck to the back of that finger is revealed. When you lift the finger again, the panel sits edge-on to the camera and its color can no longer be seen. In software, an individual note is associated with each color, creating the invisible keyboard; each finger's movement generates its own tone. The countPixels() function in the source below implements this mapping.

At Curiosity we put the glove aside and got PixelH8's ZX Spectrum to drive the light-sound synthesizer directly by generating colors on its screen, watched by the MacBook's webcam. We had these two machines - MacBook Pro and Spectrum - communing, bridging about 30 years of computer history, with a two-line program which generated random background colors on the TV attached to the Spectrum.
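I don't have the exact listing to hand, but in Sinclair BASIC something like this two-liner does the job - pick a random border and paper color, clear the screen to show it, pause half a second, and loop:

10 BORDER INT (RND*8): PAPER INT (RND*8): CLS
20 PAUSE 25: GO TO 10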

Robot Controller Linkups

Initially I was just using it to generate single tones with the Minim audio library in Processing. In the end, I'll wire it through Wildman's Robot MIDI controller. I demonstrated the prototype tonight at a Curiosity meeting, where we also got Processing to drive the robot MIDI controller through ProMIDI, and Dave managed to trigger a Stella-Artois-bottle-and-fan soundmaker in response to a single key press on a MIDI keyboard. We also had a chance to construct a few more MIDI slave boards which can drive physical sound-making devices.
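The ProMIDI side needs very little code. A minimal sketch along these lines - the device and channel numbers are placeholders you'd swap in after checking the printDevices() output for your own setup - fires a note at whatever is listening each time you click:

import promidi.*;

MidiIO midiIO;
MidiOut midiOut;

void setup(){
  size(128, 128);
  midiIO = MidiIO.getInstance(this);
  midiIO.printDevices();             // list the MIDI devices ProMIDI can see
  midiOut = midiIO.getMidiOut(0, 0); // channel 0 on device 0 - adjust for your setup
}

void draw(){
}

void mousePressed(){
  // pitch 60 (middle C), velocity 100, duration 200ms
  midiOut.sendNote(new Note(60, 100, 200));
}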

Dave driving a Bottle Flute with a MIDI keyboard

Creating the first Curiosity Collective (TM) hardware

Source Code for 'Smell the Glove'

The source code for Smell the Glove is below, and should work straight off on a MacBook Pro. It depends on the free Minim library being installed in your copy of Processing. As for the MIDI Slave Controller - the main engineering so far on the Robot Orchestra project - Wildman promises to have his schematics online soon, maybe this weekend, so the whole thing is expected to be public domain and open source.



import java.awt.*;
import java.awt.event.*;

import processing.video.*;

import ddf.minim.*;
import ddf.minim.signals.*;

Capture camera;
int videowidth = 640;
int videoheight = 480;
float numpixels = videowidth * videoheight;
float inverse_pixels = 1.0 / numpixels;

AudioOutput out;

color[] colors;
float[] levels;
SineWave[] waves;

int framerate = 30;

void setup(){
  // size() should come first in a Processing sketch
  size(videowidth, videoheight);
  frameRate(framerate);

  Minim.start(this);
  out = Minim.getLineOut();

  // one sine wave per finger: C, E-flat, F, G, B-flat - a C minor pentatonic
  waves = new SineWave[]{
    new SineWave(261.63, 0.2, out.sampleRate()),
    new SineWave(311.13, 0.2, out.sampleRate()),
    new SineWave(349.23, 0.2, out.sampleRate()),
    new SineWave(392.00, 0.2, out.sampleRate()),
    new SineWave(466.16, 0.2, out.sampleRate())
  };

  for(int i = 0; i < waves.length; i++){
    out.addSignal(waves[i]);
  }

  // reference colors at hues 0.0, 0.2, 0.4, 0.6 and 0.8, matching the painted panels
  colorMode(HSB, 1.0);
  camera = new Capture(this, videowidth, videoheight, framerate);
  colors = new color[]{ color(0, 0.9, 0.9), color(0.2, 0.9, 0.9), color(0.4, 0.9, 0.9), color(0.6, 0.9, 0.9), color(0.8, 0.9, 0.9) };

  // draw initial stage
  background(0);

  // these fullscreen helpers aren't part of Processing itself, and their
  // definitions didn't make it into this listing - restore them if you have them
  //createFullScreenKeyBindings();
  //setResolution(videowidth, videoheight);
  //setFullScreen(true);
}

void stop(){
  // always close Minim audio classes when you are done with them
  out.close();
  super.stop();
}

void captureEvent(Capture camera){
  camera.read();
}

void draw(){
  image(camera, 0, 0);
  countPixels();
  drawFingers();
}

// draw a bar for each finger whose height tracks that color's level
void drawFingers(){
  for(int i = 0; i < 5; i++){
    float step = videowidth / 5;
    float x = i * step;
    float y = 0;
    float w = step;
    float h = 2 + (levels[i] * 512);
    fill(colors[i]);
    rect(x, y, w, h);
  }
}

// bucket every pixel by hue, accumulating a weighted level for each color
void countPixels(){
  levels = new float[]{0, 0, 0, 0, 0};
  int pos = 0, bucket = 0;
  float hueval = 0;
  try{
    for(int x = 0; x < camera.width; x++){
      for(int y = 0; y < camera.height; y++){
        pos = x + (y * camera.width);
        hueval = hue(camera.pixels[pos]);
        // shift by 0.1 so the buckets are centered on hues 0.0, 0.2, 0.4, 0.6 and 0.8
        bucket = ((int) floor((hueval + 0.1) * 5)) % 5;
        // weight by brightness and saturation so dull pixels barely count
        levels[bucket] += inverse_pixels * brightness(camera.pixels[pos]) * saturation(camera.pixels[pos]);
      }
    }
    // the more of a panel the camera sees, the louder its note
    for(int i = 0; i < levels.length; i++){
      waves[i].setAmp(levels[i]);
    }
  }
  catch(RuntimeException e){
    print("Hueval: " + hueval + " Bucket: " + bucket);
    throw e;
  }
}

Tagged:

processing

curiosity

robot orchestra

make