P(l)aying Attention: Multi-Modal, Multi-Temporal Music Control

in New Interfaces for Musical Expression (NIME) 2020, Conference paper (text), Birmingham, UK (online)

Abstract

The expressive control of sound and music through body movement is well studied. For some people, however, body movement is demanding: although they would prefer to express themselves freely through gestural control, they cannot use such interfaces without difficulty. In this paper, we present the P(l)aying Attention framework for manipulating recorded music to support these people and the therapists who work with them. The aim is to facilitate body awareness, exploration, and expressivity by allowing the manipulation of a pre-recorded 'ensemble' through an interpretation of body movement, provided by a machine-learning system trained on physiotherapist assessments and movement data from people with chronic pain. The system considers the nature of a person's movement (e.g. protective) and offers an interpretation in terms of the joint groups playing a major role in that determination at each point in the movement, and to which attention should perhaps be given (or withheld, at the user's discretion). Using music to convey the interpretation offers both informational possibilities (through movement sonification) and creative ones (through manipulating the ensemble by movement). The approach supports exploring movement and music at multiple timescales and under varying musical aesthetics.