Towards multisensory control of physical modeling synthesis

Conference paper

Authors: Loïc Jankowiak, Han Han, Vincent Lostanlen, Mathieu Lagrange.

Conference: International Congress & Exposition on Noise Control Engineering (Inter-Noise)

Publication date: 2024

Link to the HAL repository

Abstract

Physical models of musical instruments offer an interesting tradeoff between computational efficiency and perceptual fidelity. Yet, they depend on a multidimensional space of user-defined parameters whose exploration by trial and error is impractical. Our article addresses this issue by combining two ideas: query by example and gestural control. On the one hand, we train a deep neural network to identify the resonator parameters of a percussion synthesizer from a single audio example, via an original method named perceptual-neural-physical sound matching (PNP). On the other hand, we map these parameters to knobs on a digital controller and configure a musical touchpad with MIDI polyphonic expression. Hence, we propose a multisensory interface between human and machine: it integrates haptic and sonic information, produces new sounds in real time, and gives visual feedback on the percussive touchpad. We demonstrate the value of this new kind of multisensory control via a musical game in which participants collaborate with the machine to imitate the sound of an unknown percussive instrument as quickly as possible. Our findings show the challenge and promise of future research in musical "Human–AI partnerships".
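The abstract combines two technical components: a sound-matching objective that weights parameter regression by a perceptual metric, and a mapping from synthesizer parameters to MIDI controls. The Python sketch below illustrates both ideas under stated assumptions; the function names, the Jacobian-based quadratic metric, the CC assignments, and the use of the mido library are illustrative choices of ours, not the authors' published implementation.

```python
import numpy as np
import mido


def pnp_style_loss(theta_pred: np.ndarray,
                   theta_true: np.ndarray,
                   jacobian: np.ndarray) -> float:
    """Hypothetical sketch of a PNP-style objective: a parameter
    regression error weighted by the local metric M = J^T J, where J
    stands in for the Jacobian of a perceptual feature map with
    respect to the synthesizer's resonator parameters."""
    residual = theta_pred - theta_true
    metric = jacobian.T @ jacobian  # local quadratic form M(theta)
    return float(residual @ metric @ residual)


def send_params_as_cc(theta: np.ndarray, cc_numbers, port) -> None:
    """Map resonator parameters, normalized to [0, 1], onto MIDI
    control-change messages so that they appear as knobs on a digital
    controller. The CC numbers are arbitrary placeholders."""
    for value, cc in zip(theta, cc_numbers):
        port.send(mido.Message('control_change',
                               control=cc,
                               value=int(round(float(value) * 127))))


# Example usage (requires a connected MIDI output):
# port = mido.open_output()
# send_params_as_cc(np.array([0.2, 0.7, 0.5]), cc_numbers=[20, 21, 22], port=port)
```

In this reading, the neural network's predicted parameters feed the same mapping as the physical knobs, so query-by-example and gestural control share one parameter space.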