Open Access

Real-Time Gesture-Controlled Physical Modelling Music Synthesis with Tactile Feedback

EURASIP Journal on Advances in Signal Processing 2004, 2004:830184

https://doi.org/10.1155/S1110865704311182

Received: 30 June 2003

Published: 27 June 2004

Abstract

Electronic sound synthesis continues to offer vast potential for the creation of new musical instruments. The traditional approach is, however, seriously limited: it incorporates only auditory feedback, and it typically makes use of a sound synthesis model (e.g., additive, subtractive, wavetable, or sampling) that is inherently limited and very often nonintuitive to the musician. In a direct attempt to address these issues, this paper describes a system that provides tactile as well as acoustic feedback, with real-time synthesis that invokes a more intuitive response from players since it is based upon mass-spring physical modelling. Virtual instruments are set up via a graphical user interface in terms of the physical properties of basic, well-understood sounding objects such as strings, membranes, and solids. These can be interconnected to form complex integrated structures. Acoustic excitation can be applied at any point mass via virtual bowing, plucking, striking, a specified waveform, or any external sound source. Virtual microphones can be placed at any point masses to deliver the acoustic output. These aspects of the instrument are described, along with the nature of the resulting acoustic output.
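To make the mass-spring idea concrete, the following Python/NumPy fragment is a minimal sketch, not the system described in this paper: a plucked string modelled as a chain of point masses joined by damped springs, excited by displacing one mass and read out by a "virtual microphone" at another. All names and parameter values (N, k, mass, damping, pluck and microphone positions) are illustrative assumptions.

```python
# Minimal mass-spring string sketch (illustrative only, not the
# authors' implementation). A chain of N point masses with fixed
# ends, damped nearest-neighbour springs, plucked at one mass and
# "recorded" at another, as in the paper's point-mass description.
import numpy as np

N = 100            # number of point masses (assumed value)
fs = 44100         # audio sample rate in Hz
dt = 1.0 / fs      # integration time step
mass = 1e-4        # mass of each point in kg (assumed value)
k = 5e3            # spring stiffness in N/m (assumed value)
damping = 0.5      # velocity-proportional loss term (assumed value)

pos = np.zeros(N)  # transverse displacement of each mass
vel = np.zeros(N)  # velocity of each mass

# Pluck: excitation "applied at any point mass" by displacing one mass.
pos[N // 4] = 1e-3

mic_at = 3 * N // 4   # virtual microphone placed at one point mass
out = np.empty(fs)    # one second of output samples

for n in range(fs):
    # Spring force from both neighbours; the chain has fixed ends,
    # represented by zero displacement beyond each boundary.
    left = np.concatenate(([0.0], pos[:-1]))
    right = np.concatenate((pos[1:], [0.0]))
    force = k * (left - 2.0 * pos + right) - damping * vel
    # Symplectic Euler update: velocity first, then position.
    vel += (force / mass) * dt
    pos += vel * dt
    out[n] = pos[mic_at]  # microphone reads one mass's displacement

out /= np.max(np.abs(out))  # normalise for listening
```

With these assumed constants the stiffest mode stays well inside the stability limit of the symplectic Euler step at 44.1 kHz; excitation by bowing or striking, or additional microphones, would simply modify the input forces or add further read-out points in the same loop.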

Keywords

physical modelling, music synthesis, haptic interface, force feedback, gestural control

Authors’ Affiliations

(1) Media Engineering Research Group, Department of Electronics, University of York

Copyright

© Howard and Rimell 2004