SurfaceMusic is a tabletop music system in which touch gestures are mapped to physical models of instruments. Because physical models offer parametric control over sound, they allow a more natural interaction between gesture and sound. We discuss the design and implementation of a simple gestural interface for interacting with virtual instruments, along with a messaging system that conveys gesture data to the audio system.
Lawrence Fyfe, Sean Lynch, Carmen Hull and Sheelagh Carpendale. SurfaceMusic: Mapping Virtual Touch-based Instruments to Physical Models. In Proceedings of the Conference on New Interfaces for Musical Expression, pages 360-363, 2010.
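To make the gesture-to-audio pipeline concrete, the sketch below shows one plausible shape for such a messaging system: touch events are encoded as OSC-style address/argument messages and sent to the audio process over UDP. The message format, function names, and port are illustrative assumptions, not the paper's actual protocol.

```python
import socket

def format_gesture_message(instrument: str, x: float, y: float) -> str:
    # Hypothetical encoding: an OSC-style address path for the target
    # instrument, followed by normalized touch coordinates in [0, 1].
    return f"/gesture/{instrument} {x:.3f} {y:.3f}"

def send_gesture(msg: str, host: str = "127.0.0.1", port: int = 9000) -> None:
    # Fire-and-forget UDP datagram; the audio engine listening on
    # host:port maps the coordinates to physical-model parameters.
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(msg.encode("utf-8"), (host, port))

msg = format_gesture_message("flute", 0.25, 0.75)
send_gesture(msg)
```

In practice a gestural interface like this would emit one such message per touch update, letting the audio system interpolate model parameters continuously between events.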