Using sound in multi-touch interfaces to change materiality and touch behavior
Current developments in multimodal interfaces allow us to interact with digitally represented objects. Unfortunately, these representations are often poor due to technical limitations in rendering some of the objects' sensory properties. Here we explore the possibility of overcoming these limitations by exploiting multisensory integration processes, and we propose a sound-based interaction technique to alter the perceived materiality of a surface being touched and to shape users' touch behavior. The latter can be seen both as a cue of, and as a means to reinforce, the altered perception. We designed a prototype that dynamically alters texture-related sound feedback based on touch behavior, as in natural surface touch interactions. A user study showed that the frequency of the sound feedback alters texture perception (coldness and material type) and touch behavior (velocity and pressure). We conclude by discussing lessons learnt from this work in terms of HCI applications and the questions opened by this research.

Copyright is held by the owner/author(s).
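To make the feedback mechanism concrete, the following is a minimal hypothetical sketch of how touch dynamics might modulate the frequency of texture-related sound feedback. The function name, the linear mapping, and all gain parameters are illustrative assumptions, not details taken from the paper's prototype.

```python
# Hypothetical sketch (not the authors' implementation): map touch
# velocity and pressure to the frequency of texture-related sound
# feedback, so that faster or firmer touches yield higher frequencies.
def feedback_frequency(velocity, pressure, base_hz=440.0,
                       velocity_gain=200.0, pressure_gain=100.0):
    """Return a playback frequency in Hz scaled by touch dynamics.

    velocity and pressure are assumed normalized to [0, 1];
    base_hz and the gains are arbitrary illustrative values.
    """
    return base_hz + velocity_gain * velocity + pressure_gain * pressure

# A light, slow touch produces lower-frequency feedback than a fast,
# firm one, mirroring how natural textures sound under different touches.
slow = feedback_frequency(velocity=0.1, pressure=0.2)  # 480.0 Hz
fast = feedback_frequency(velocity=0.9, pressure=0.8)  # 700.0 Hz
```

In a real system, the resulting frequency would drive an audio synthesizer in a closed loop with the touch sensor, which is what lets the feedback both reflect and reshape the user's touch behavior.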