[ ]

[  ] 

By Josh Becker '19, An-Ya Olson '22 & Matt Wang '19

[  ] is a sound installation that lets the user explore sound through physical space. We use motion capture to track both the user and an independent object, a box. Audio parameters are mapped to the user's limbs and to the box; some mappings are independent and others are coupled, so the overall mapping is opaque yet deterministic, which makes it a compelling space to explore.

To implement our mapping, we used the Wekinator, a machine learning system written by Rebecca Fiebrink. The Wekinator let us define a set of audio parameters and easily train a machine learning model that maps our spatial parameters (sent over the network as OSC) to those audio parameters.
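As a rough sketch of how this pipeline can be wired up (not our exact code), the snippet below uses the python-osc library to stream a flat vector of tracker positions to Wekinator and to listen for the trained model's outputs. The tracker names and feature layout are hypothetical; the addresses and ports shown are Wekinator's defaults (/wek/inputs on 6448, /wek/outputs on 12000).

```python
# Sketch: stream motion-capture features to Wekinator over OSC and
# receive the model's audio-parameter outputs. Assumes the python-osc
# package; tracker names and feature layout are hypothetical.
import threading

from pythonosc.udp_client import SimpleUDPClient
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import ThreadingOSCUDPServer

# Wekinator defaults: it listens for inputs on port 6448
# and sends outputs to port 12000.
wek_in = SimpleUDPClient("127.0.0.1", 6448)

def send_features(trackers):
    """Flatten the (x, y, z) position of each tracker into one fixed-length input vector."""
    features = []
    for name in ("left_wrist", "right_wrist", "left_ankle", "right_ankle", "box"):
        x, y, z = trackers[name]
        features.extend([x, y, z])
    wek_in.send_message("/wek/inputs", features)

def on_outputs(address, *params):
    """Wekinator sends one float per trained output (audio) parameter."""
    print("audio parameters:", params)

dispatcher = Dispatcher()
dispatcher.map("/wek/outputs", on_outputs)
server = ThreadingOSCUDPServer(("127.0.0.1", 12000), dispatcher)
threading.Thread(target=server.serve_forever, daemon=True).start()
```

Sending one flat, fixed-length list per frame matches what Wekinator expects as an input example, so the same message format works for both training and running the trained model.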

In our initial testing of the Wekinator with a drum loop as the sound input, we noticed that it tended to produce chaotic results, so for our final piece we chose sounds that would contrast with that chaos. We selected three audio samples to combine as inputs to our final Wekinator mappings: soft rain, Swedish birds, and harp music.
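One simple way to picture how the model's outputs could be combined with those three samples is to treat a few of them as per-layer gains. This is a sketch only; the real behavior lives in the trained model and our audio patch, and the layer-gain scheme here is an assumption.

```python
# Sketch: treat three of Wekinator's output parameters as per-layer gains
# for the rain, bird, and harp samples. The clamping and layer order are
# illustrative assumptions, not our actual synthesis patch.
def layer_gains(params):
    """Map output parameters to gains for (rain, birds, harp), clamped to [0, 1]."""
    rain, birds, harp = (min(max(p, 0.0), 1.0) for p in params[:3])
    return {"rain": rain, "birds": birds, "harp": harp}

# A handler like on_outputs() above could call this each time Wekinator
# sends new values, then apply the gains to the three sample players.
print(layer_gains([0.2, 0.85, 0.5]))  # {'rain': 0.2, 'birds': 0.85, 'harp': 0.5}
```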

To explore [  ], put on the motion-tracking wearables as follows:

  • Pink: Left wrist
  • Orange: Right wrist
  • Yellow/green: Left ankle
  • Clear: Right ankle

Harp recording provided by:
https://freesound.org/people/RepDac3/sounds/348655/
Swedish Bird recording provided by:
https://freesound.org/people/straget/sounds/415646/
Kawaii Rain recording provided by:
https://freesound.org/people/inchadney/sounds/104390/