Leslie, Grace & Picard, Rosalind & Lui, Simon. (2015). An EEG and Motion Capture Based Expressive Music Interface for Affective Neurofeedback. 10.13140/RG.2.1.4378.6081.
This project examines how the expressiveness granted by new musical interfaces can be harnessed to create positive changes in health and well-being. We are conducting experiments to measure the EEG dynamics and physical movements of participants using software designed to invite physical and musical expression of the basic emotions. The present demonstration of this system pairs an expressive gesture sonification system, built on a Leap Motion device, with an ambient music engine controlled by EEG-based affective indices. Our aim is to better understand affective engagement by creating both a new musical interface that invites it and a method to measure and monitor it. We are exploring the use of this device and protocol in therapeutic settings where mood recognition and regulation are primary goals.

1 Précis

Current methods of inviting emotional response in a laboratory environment rely on passive media, such as images from the International Affective Picture System (IAPS), chosen to incite one of a few basic emotions when displayed to an experiment participant. Full affective engagement, however, may require active participation, and physical expression is one avenue by which a participant may be invited to actively experience an emotion. We are demonstrating a new musical interface that maps expressive gesture to sound, designed to invite participants to affectively engage with the basic emotions.

In its present form, the interface consists of a Leap Motion [1] device that tracks repeated small expressive gestures made by a participant's hand. Concatenative synthesis software [2] translates these gestures into an expressive wash of sound by mapping the first and second principal components of the recorded movement to the computed spectral centroid and periodicity of each sound segment selected for playback. A dry-electrode EEG system [3] records EEG while the participant performs gestures corresponding to a basic emotion (anger, grief, joy, etc.). The artifact subspace reconstruction (ASR) method, as implemented in the
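The gesture-to-sound mapping described above can be illustrated with a minimal sketch. It assumes palm positions have already been captured from the Leap Motion as (x, y, z) frames; the per-gesture normalization and the nearest-neighbour segment selection over (spectral centroid, periodicity) descriptors are assumptions for illustration, since the paper does not specify how the concatenative engine [2] performs unit selection.

    # Sketch: PCA of a recorded gesture drives two synthesis control targets.
    # Scaling and selection details are assumptions, not the authors' method.
    import numpy as np
    from sklearn.decomposition import PCA

    def map_gesture_to_targets(palm_positions):
        """Project a gesture onto its first two principal components and
        rescale each to [0, 1] as synthesis control targets."""
        pca = PCA(n_components=2)
        scores = pca.fit_transform(np.asarray(palm_positions))  # (n_frames, 2)

        def norm(v):
            rng = v.max() - v.min()
            return (v - v.min()) / rng if rng > 0 else np.zeros_like(v)

        centroid_target = norm(scores[:, 0])     # drives spectral centroid
        periodicity_target = norm(scores[:, 1])  # drives periodicity
        return centroid_target, periodicity_target

    def select_segments(centroid_target, periodicity_target, corpus_descriptors):
        """Nearest-neighbour selection: pick, per gesture frame, the corpus
        segment whose (spectral_centroid, periodicity) descriptor pair is
        closest to the frame's control targets."""
        descriptors = np.asarray(corpus_descriptors)          # (n_segments, 2)
        targets = np.stack([centroid_target, periodicity_target], axis=1)
        dists = np.linalg.norm(descriptors[None, :, :] - targets[:, None, :], axis=2)
        return dists.argmin(axis=1)                           # one index per frame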
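The paper does not state which EEG-based affective indices control the ambient music engine. As one plausible candidate, the sketch below computes frontal alpha asymmetry, a widely used valence correlate; the channel pairing (F3/F4), alpha band limits, and log-ratio formulation are all assumptions for illustration.

    # Sketch: frontal alpha asymmetry as an example affective index.
    import numpy as np
    from scipy.signal import welch

    def alpha_power(signal, fs, band=(8.0, 13.0)):
        """Mean power spectral density in the alpha band (Welch's method)."""
        freqs, psd = welch(signal, fs=fs, nperseg=min(len(signal), 2 * fs))
        mask = (freqs >= band[0]) & (freqs <= band[1])
        return psd[mask].mean()

    def frontal_alpha_asymmetry(left_frontal, right_frontal, fs):
        """log(right) - log(left) alpha power over homologous frontal sites
        (e.g., F4 vs. F3); higher values are commonly interpreted as more
        positive valence / approach motivation."""
        return (np.log(alpha_power(right_frontal, fs))
                - np.log(alpha_power(left_frontal, fs)))

    # Usage with four seconds of synthetic two-channel data at 256 Hz:
    fs = 256
    t = np.arange(4 * fs) / fs
    f3 = np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.randn(t.size)
    f4 = 0.5 * np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.randn(t.size)
    print(frontal_alpha_asymmetry(f3, f4, fs))

An index like this, computed over a sliding window, could be smoothed and mapped to parameters of the music engine; the actual indices and mapping used in the system are not specified in this excerpt.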