Facial expression as a way to comprehend social interaction in mice
Research Unit 1, SCIoI Project 25
In social interactions, information processing is known to be strongly biased by emotional states, contexts, and expectations. Body postures and facial expressions are a rich source of information about the state of mind, intentions, and even more enduring personality traits of communication partners. The aim of this project is to investigate behavioral, neurocognitive, and computational aspects of the interaction between learning and social perception. We propose that there is a complex and bi-directional interplay between emotions and cognition in the service of efficient information processing and communication. The interplay between perceivable features and observer-internal sources of information enables an agent to quickly draw inferences about the mental states, emotions, and intentions of others. This leads to the hypothesis that emotional behavior serves as a shortcut that reduces dimensionality and bypasses more costly cognitive processes. For example, a negative emotional reaction of a conspecific can serve as a simple cue for avoidance learning, while positive emotions will facilitate learning and prolong giving-up latency in trial-and-error learning.
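The hypothesized asymmetry, in which an observed emotional cue modulates how strongly a trial-and-error learner updates, can be sketched as an emotion-scaled learning rule. This is a minimal illustrative sketch under our own assumptions; the function `update_value`, the cue encoding in [-1, 1], and all parameters are hypothetical and not part of the project itself.

```python
# Hypothetical sketch: an observed emotional cue of a conspecific scales
# the learning rate of a simple value update. All names and parameters
# are illustrative assumptions, not an implementation from the project.

def update_value(value, reward, emotion_cue, base_lr=0.1):
    """One temporal-difference-style update.

    emotion_cue in [-1, 1]: the salience of the cue (its magnitude)
    boosts the learning rate, so a strong negative reaction speeds
    avoidance learning and a strong positive reaction speeds approach
    learning, relative to a neutral observation.
    """
    lr = base_lr * (1.0 + abs(emotion_cue))  # emotional salience boosts learning
    return value + lr * (reward - value)


# A bad outcome paired with a negative cue drives the value estimate
# down faster than the same outcome observed without an emotional cue.
v_neutral = update_value(0.5, 0.0, emotion_cue=0.0)    # 0.5 + 0.1 * (0 - 0.5) = 0.45
v_negative = update_value(0.5, 0.0, emotion_cue=-1.0)  # 0.5 + 0.2 * (0 - 0.5) = 0.40
```

Under this sketch, the emotional cue acts exactly as the text suggests: a dimensionality-reducing signal that shortcuts slower evaluation by directly amplifying learning from the outcome.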
In this project, we focus on mice as an animal model in order to understand how social responsiveness facilitates learning and how it is modulated by priors and emotions.
The empirical description of how learning in social interaction is realized in mice will
form the basis for computational modeling of subjective perception, bridging the gap between current, purely bottom-up models and models that integrate the top-down influences of subjective aspects in synthetic agents. In perspective, social robots may be equipped with perceptual biases, making social interactions more natural. Hence, social robots will be enabled to behave more sensitively and more congenially towards their biological partners. The loop from analytic to synthetic and back will be closed by what we learn about mouse communication from creating robots capable of dimension reduction via emotion recognition. We envision a system (a simulator of the facial expression and body posture/movement of a mouse) in which basal emotional communication can be established between experimenters and mice using robotic mice capable of facial expressions. Such a device would allow us to further test hypotheses on the interaction of emotions and cognition.
The data derived in this project will serve as a basis for an in-depth analysis of whether and
how our model organism exhibits intelligent behavior. By comparing mice to other species such as rats and cockatoos, to human subjects, and to artificial agents, we will gain a far better understanding of intelligence and especially of different grades of intelligent behavior.