MIT Media Lab, E14-633
Emotion is key to the effectiveness of media, whether in influencing memory, likability, or persuasion. However, understanding of the role of emotions in media effectiveness has been limited by the difficulty of measuring emotions in real-life contexts. This lack of understanding results in large amounts of wasted time, money, and other resources. In this thesis, Daniel McDuff presents the first large-scale emotion measurement studies of video advertising, political debates, and movies.
Facial expressions, heart rate, respiration rate, and heart rate variability can inform us about a person's emotional valence, arousal, and engagement. In this thesis, McDuff demonstrates how automatically detected naturalistic and spontaneous facial responses, together with physiological responses, can be used to predict the effectiveness of media. Furthermore, this measurement can be performed remotely using low-cost camera sensors.
McDuff presents a framework for automatically measuring facial and physiological responses, alongside self-report and behavioral measures, to content (e.g., video advertisements) viewed over the Internet, in order to understand the role of emotions in media effectiveness. Specifically, he will present analysis of the first large-scale dataset of facial, physiological, behavioral, and self-report responses to video content collected "in-the-wild" using the cloud. The data include over 20,000 video responses from thousands of individuals. McDuff has developed models for evaluating the effectiveness of media (likability, persuasion, and short-term sales impact) based on the automatically extracted features. This work demonstrates success in predicting measures of media effectiveness that are useful in content creation, whether in copy-testing or content development.
Host/Chair: Rosalind W. Picard
Jeffrey Cohn, Ashish Kapoor, Thales Teixeira