
The affective remixer

Jaewoo Chung, Scott Vercoe

Abstract

This paper describes a real-time music-arranging system that reacts to immediate affective cues from a listener. Data was collected on the potential of certain musical dimensions to elicit change in a listener’s affective state, using sound files created explicitly for the experiment through composition/production, segmentation, and re-assembly of music along these dimensions. Based on the listener data, a probabilistic state transition model was developed to infer the listener’s current affective state. A second model was developed to select music segments and re-arrange (‘remix’) them to induce a target affective state. We propose that this approach provides a new perspective for characterizing musical preference.
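To make the abstract’s two models concrete, the sketch below shows one way a probabilistic state-transition model and a target-driven segment selector could fit together. It is an illustrative assumption only: the state labels, segment classes, transition probabilities, and function names (`infer_state`, `choose_segment`) are hypothetical placeholders, not values or code from the paper.

```python
import numpy as np

# Hypothetical affective states and musical segment classes (not from the paper).
STATES = ["low-arousal", "high-arousal"]
SEGMENT_CLASSES = ["sparse", "dense"]

# P(next state | current state, segment class). In the paper such transition
# probabilities would be estimated from listener data; here they are hard-coded
# purely for demonstration.
TRANSITIONS = {
    "sparse": np.array([[0.8, 0.2],
                        [0.6, 0.4]]),
    "dense":  np.array([[0.3, 0.7],
                        [0.1, 0.9]]),
}

def infer_state(belief, cue_likelihood):
    """Update the belief over the listener's affective state from an
    immediate affective cue (a per-state likelihood of the observation)."""
    posterior = belief * cue_likelihood
    return posterior / posterior.sum()

def choose_segment(belief, target_index):
    """Pick the segment class whose transition model gives the highest
    probability of moving the listener toward the target affective state."""
    def target_prob(seg):
        return (belief @ TRANSITIONS[seg])[target_index]
    return max(SEGMENT_CLASSES, key=target_prob)

# Usage: an uncertain belief, a cue suggesting low arousal, and a high-arousal target.
belief = np.array([0.5, 0.5])
belief = infer_state(belief, cue_likelihood=np.array([0.7, 0.3]))
print("inferred state distribution:", belief)
print("next segment class:", choose_segment(belief, target_index=1))
```

In this reading, the first model is the belief update over affective states and the second is the greedy choice of the next segment class; the actual system may differ in state space, inference method, and selection policy.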
