Visual Instrument

This application turns touches into physics-based interactions; the resulting collisions create sounds based on where they hit and on impact velocity. The result is a soothing, endless melodic tune that never repeats itself. The visuals provide an ambient backdrop to help you play: the capsules represent your touches, and each moving dot is a physics-enabled ball that contains a note. Hit a ball and its note plays; if a ball hits the edge of the screen, that collision also triggers the note.
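
The following is a minimal sketch, not the app's actual code, of the mapping just described: pitch comes from where the collision lands on screen and loudness from how hard the ball was hit. The pentatonic scale, speed scaling, and function names are illustrative assumptions.

```swift
import Foundation

// Hypothetical sketch: map a collision point and impact speed to a note.
// The scale and scaling factors are assumptions, not values from the app.
struct Note {
    let midiPitch: Int   // which pitch to play
    let velocity: Int    // MIDI-style loudness, 0...127
}

/// Pick a pitch from where the ball hit and a loudness from how hard it hit.
func note(forCollisionAt x: Double, screenWidth: Double, impactSpeed: Double) -> Note {
    let pentatonic = [60, 62, 64, 67, 69]            // C major pentatonic, one octave
    let position = min(max(x / screenWidth, 0), 1)   // normalize horizontal hit point to 0...1
    let index = Int(position * Double(pentatonic.count - 1))
    let loudness = max(0, min(Int(impactSpeed), 127))  // faster impact -> louder note
    return Note(midiPitch: pentatonic[index], velocity: loudness)
}
```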

Jazz guitar virtuoso Paul Musso provided guitar recordings of chords typically used to create a melody: F#, D#, C, and G. Finger touches (up to 4 simultaneous touches are supported) create collisions that trigger one of these chords. Every 3 seconds, the key randomly changes to A Minor, C, F, or G.
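
As a rough illustration of the two behaviors above, the sketch below re-rolls the key on a three-second timer and responds to each collision with a chord in the current key. The class, logging, and timing mechanism are assumptions for illustration; the real app plays Paul Musso's recorded chords rather than printing text.

```swift
import Foundation

// Hypothetical sketch: a timer changes the key every 3 seconds, and each
// touch-driven collision triggers a chord in whatever key is current.
final class ChordPlayer {
    private let keys = ["A Minor", "C", "F", "G"]
    private(set) var currentKey = "C"
    private var timer: Timer?

    func start() {
        // Every 3 seconds, pick a new key at random.
        timer = Timer.scheduledTimer(withTimeInterval: 3.0, repeats: true) { [weak self] _ in
            guard let self = self else { return }
            self.currentKey = self.keys.randomElement()!
        }
    }

    /// Called for each collision caused by one of the (up to 4) touches.
    func collisionOccurred() {
        // Stand-in for triggering one of the recorded guitar chords.
        print("Play a chord in \(currentKey)")
    }
}
```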

This project was also presented as an installation, activated by a Kinect motion sensor instead of touch.

Available on the Apple App Store and Google Play.

media: computer program for mobile devices, Android and iOS, completed: 2012

Instrument Controller

The inspiration for the app came from an experimental synthesizer interface that I designed for the iPad and iPhone, allowing the user to control the synth through touch. The visuals are driven by the same interface controls, so a user can play the synthesizer visually rather than by watching a control surface. As the on-screen tube narrows, for example, the synthesizer's chorus narrows with it. Colors change, in both the background and the object, in response to changes in frequency amplitude. This piece was performed at the Denver BYOB (Bring Your Own Beamer) in 2011.
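
The coupling described above can be sketched as a single control value driving both an audio parameter and its visual counterpart. Everything here is an assumption for illustration (parameter names, ranges, and the idea of a shared normalized control), not the piece's actual patch.

```swift
import Foundation

// Illustrative sketch: one touch control updates the chorus width and the
// tube radius together, so the sound and the image narrow in lockstep.
struct SynthVisualState {
    var chorusWidth: Double = 1.0  // 0 = dry, 1 = widest chorus (assumed range)
    var tubeRadius: Double = 1.0   // 0 = collapsed tube, 1 = widest tube
    var hue: Double = 0.5          // background/object color

    /// One normalized control (0...1) drives sound and image at once.
    mutating func setWidthControl(_ value: Double) {
        let v = min(max(value, 0), 1)
        chorusWidth = v
        tubeRadius = v
    }

    /// Map an amplitude reading from the synth output to a color change.
    mutating func setAmplitude(_ amplitude: Double) {
        hue = min(max(amplitude, 0), 1)
    }
}
```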

The programs used include MRMR, which uses the OSC protocol to send control messages to Max/MSP/Jitter; Max converts them to MIDI for Reason and also runs the visuals.
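
As a sketch of that control chain, the code below turns an OSC-style message (an address plus a normalized 0–1 float, as MRMR sends) into a 3-byte MIDI Control Change for the synth. The address string and controller number are made-up examples; in the performance this translation happened inside Max/MSP/Jitter rather than in Swift.

```swift
import Foundation

// Hypothetical sketch of the OSC -> MIDI step in the control chain.
struct OSCMessage {
    let address: String   // e.g. "/mrmr/slider/1" (illustrative address)
    let value: Float      // normalized 0...1
}

/// Convert a normalized OSC value into a MIDI Control Change message
/// (status byte, controller number, controller value).
func midiControlChange(from message: OSCMessage, channel: UInt8, controller: UInt8) -> [UInt8] {
    let clamped = min(max(message.value, 0), 1)
    let ccValue = UInt8(clamped * 127)            // MIDI CC range is 0...127
    let status: UInt8 = 0xB0 | (channel & 0x0F)   // 0xB0 = Control Change status
    return [status, controller, ccValue]
}

// Example: a slider at 50% becomes CC #74 with value 63 on channel 1.
let bytes = midiControlChange(from: OSCMessage(address: "/mrmr/slider/1", value: 0.5),
                              channel: 0, controller: 74)
```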


media: iPhone/iPad, LCD projection, monitors, 2-channel sound; dimensions: variable, completed: 2011