Artificial listener with social intelligence

Jin Joo Lee

A social robot modifies its behavior to change what you think about it!

In storytelling interactions, storytellers employ a subtle social cueing strategy when they believe a listener is paying attention to their story. But if they think the listener is not paying attention, this strategy flips to one of stronger cueing signals meant to win the listener's attention back. I discovered these different strategies using machine learning models that find such patterns in real-world human interaction data.
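
As a simplified illustration of this pattern-finding step (not the actual models or data from the study), the sketch below clusters made-up storyteller cue features into two groups; an unsupervised model separates a subtle-cue strategy from a strong-cue strategy. All feature names and values are assumptions for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans

# Placeholder data: each row is one storytelling moment described by two
# illustrative cue features (e.g., prosodic emphasis, gaze-aversion rate).
rng = np.random.default_rng(0)
subtle_moments = rng.normal([0.2, 0.3], 0.05, size=(50, 2))  # listener seems attentive
strong_moments = rng.normal([0.7, 0.8], 0.05, size=(50, 2))  # listener seems inattentive
features = np.vstack([subtle_moments, strong_moments])

# An unsupervised model recovers the two cueing strategies as clusters.
model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(features)
for k, center in enumerate(model.cluster_centers_):
    print(f"strategy {k}: mean cue features = {center.round(2)}")
```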

Using a Bayesian theory of mind (BToM) approach, my AI algorithm, represented as a dynamic Bayesian network (DBN), tracks which strategy a storyteller is exhibiting to infer what they currently think about a listening robot. Do they think the robot is being attentive or inattentive to their story? Based on this estimate, the robot then changes its body language, raising or toning down its level of expressivity, because its goal is to communicate an appropriate level of attentiveness. The social robot is in tune with what you think about it and modifies its behavior to alter your perception.
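
As a minimal sketch of this idea (not the published DBN), the code below runs a two-state Bayes filter over the storyteller's inferred belief about the robot, attentive versus inattentive, updated from the observed cue strength, and picks an expressivity level with a toy policy. The states, probabilities, and threshold are illustrative assumptions.

```python
import numpy as np

# Storyteller's belief about the robot: index 0 = "thinks it is attentive",
# index 1 = "thinks it is inattentive". All numbers below are made up.
TRANSITION = np.array([[0.9, 0.1],   # P(next belief | current belief)
                       [0.2, 0.8]])
EMISSION = {"subtle": np.array([0.8, 0.3]),   # P(cue strength | belief):
            "strong": np.array([0.2, 0.7])}   # subtle cues go with "attentive"

def update_belief(belief, observed_cue):
    """One forward step of a Bayes filter over the storyteller's belief."""
    predicted = TRANSITION.T @ belief             # propagate one step
    posterior = EMISSION[observed_cue] * predicted
    return posterior / posterior.sum()            # normalize

def choose_expressivity(belief):
    """Toy policy: raise expressivity when the storyteller likely thinks
    the robot is inattentive, tone it down otherwise."""
    return "high" if belief[1] > 0.5 else "low"

belief = np.array([0.5, 0.5])                     # start uncertain
for cue in ["subtle", "subtle", "strong", "strong"]:
    belief = update_belief(belief, cue)
    print(cue, belief.round(2), "->", choose_expressivity(belief))
```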

We integrated this AI algorithm into a real-time perception-to-behavior generation pipeline that controls a social robot. The software architecture detects prosodic- and gaze-based social cues from a child storyteller, decides how the robot should respond based on its AI policy, and controls the set of expressive animations the robot exhibits.
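
The sketch below shows only the shape of such a pipeline, a single perception-policy-animation tick, with placeholder module names, features, and thresholds; it is not the actual software architecture.

```python
from dataclasses import dataclass

@dataclass
class CueEvent:
    kind: str        # e.g. "gaze" or "prosody"
    strength: float  # normalized cue strength from the perception module

def perceive(frame: dict) -> list[CueEvent]:
    """Stand-in for prosodic- and gaze-based social cue detection."""
    return [CueEvent("gaze", frame.get("gaze_on_robot", 0.0)),
            CueEvent("prosody", frame.get("pitch_energy", 0.0))]

def decide(cues: list[CueEvent]) -> str:
    """Stand-in for the AI policy: map detected cues to a behavior class
    using an arbitrary placeholder threshold."""
    avg = sum(c.strength for c in cues) / max(len(cues), 1)
    return "high_expressivity" if avg < 0.4 else "low_expressivity"

def act(behavior: str) -> None:
    """Stand-in for the robot's expressive animation controller."""
    print(f"playing animation set: {behavior}")

# One tick of the real-time loop.
frame = {"gaze_on_robot": 0.2, "pitch_energy": 0.5}
act(decide(perceive(frame)))
```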

People do this kind of “mind reading” to understand what others think about them, and it's often considered common sense or even street smarts. The success of robots will depend heavily on their ability to communicate effectively with us as socially intelligent agents.