Research interest in real-time performance control of sound synthesis has grown as the variety of hardware available to mediate this control has expanded, driven chiefly by developments in video game and mobile device technology and the arrival of programmable open-source hardware systems. These devices offer a range of new interactions with real-time synthesis software, such as multi-touch surfaces, tangible objects, and orientation and motion capture (e.g. the Microsoft Kinect), which can be combined with more traditional sensors such as switches, dials and faders.
The Multi-Modal Instrument (MMI) is a platform under development for experimentation with controller mapping in a physical (modal) modelling synthesis setting, allowing a range of affordable contemporary and traditional controllers to be combined and applied to synthesis model parameters. This work in progress will enable a set of evaluation experiments to be carried out, exploring approaches to optimising device mappings for efficient, accurate and creative performance.
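As an illustration of the kind of mapping layer described above, the following minimal Python sketch routes normalised controller values to synthesis parameter ranges, supporting both one-to-one and one-to-many mappings. All names and structure here are hypothetical, for illustration only, and are not taken from the MMI codebase.

```python
# Hypothetical sketch of a controller-to-parameter mapping layer.
# Names and structure are illustrative, not part of MMI itself.

class Mapping:
    """Scale a normalised controller value (0..1) into a parameter range."""
    def __init__(self, param, lo, hi):
        self.param, self.lo, self.hi = param, lo, hi

    def apply(self, value, params):
        # Linear scaling from [0, 1] into [lo, hi].
        params[self.param] = self.lo + value * (self.hi - self.lo)


class Mapper:
    """Route named controller inputs to one or more synthesis parameters."""
    def __init__(self):
        self.routes = {}  # controller name -> list of Mapping

    def connect(self, control, mapping):
        self.routes.setdefault(control, []).append(mapping)

    def update(self, control, value, params):
        # Apply every mapping attached to this controller input.
        for m in self.routes.get(control, []):
            m.apply(value, params)


# Example: one fader drives both damping and brightness (one-to-many).
params = {}
mapper = Mapper()
mapper.connect("fader1", Mapping("damping", 0.0, 1.0))
mapper.connect("fader1", Mapping("brightness", 200.0, 8000.0))
mapper.update("fader1", 0.5, params)
# params now holds damping=0.5, brightness=4100.0
```

A table of such mappings can be swapped or re-weighted at run time, which is one way the platform's mapping experiments could be parameterised.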
- Richard Polfreman