Multimodal Interaction and Communication

Creating a robot that can integrate information from different sources and modalities

Research Unit: 1

Project Number: 9

Example Behavior:
Social Intelligence

Disciplines:
Psychology
Robotics

External Collaborators:
Doris Pischedda
Murat Kirtay
Anna Kuhlen

Expected Project Duration:
2019 - 2023


The overall goal of this project is to create a robot that can represent and integrate information from different sources and modalities for successful, task-oriented interactions with other agents. To fully understand the mechanisms of social interaction and communication in humans and to replicate this complex human skill in technological artifacts, we must provide effective means of knowledge transfer between agents. The first step of this project is therefore to describe core components and determinants of communicative behavior, including joint attention, partner co-representation, information processing from different modalities, and the role of motivation and personal relevance (Kaplan & Hafner, 2006; Kuhlen & Abdel Rahman, 2017; Kuhlen et al., 2017). We will compare these functions in human-human, human-robot, and robot-robot interactions to identify commonalities and differences. This comparison will also consider the role of different presumed partner attributes (e.g., a robot described as “social” or “intelligent”). We will conduct behavioral, electrophysiological, and fMRI experiments to describe the microstructure of communicative behavior.

The second step of the project is to create predictive models for multimodal communication that can account for these psychological findings in humans. We will identify both the prerequisites and the factors acting as priors, and develop suitable computational models that represent multimodal sensory features in an abstract but biologically inspired way, suitable for extracting principles of intelligence (Schillaci et al., 2013). In the longer term, the third step of this project is to use these models to generate novel predictions of social behavior in humans.

Throughout the project we will focus on the processing of complex multimodal information, a central characteristic of social interactions that has nevertheless thus far been investigated mostly within single modalities. We assume that multimodal information, e.g., combining auditory (speech) with visual (face, eye gaze) or tactile (touch) cues, will augment partner co-representation and thereby improve communicative behavior.
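To make the modeling idea in the second step concrete, below is a minimal, hypothetical sketch of multimodal integration via predictive learning: per-modality feature vectors are fused into one abstract state, and a simple linear forward model learns to predict the next fused state from the current one, with the prediction error driving learning. The feature dimensions, the fusion-by-concatenation scheme, and the synthetic data are illustrative assumptions, not the project’s actual models.

```python
# Minimal sketch (not the project's actual model): multimodal fusion plus
# a linear forward model trained by prediction error (delta rule).
import numpy as np

rng = np.random.default_rng(0)

# Assumed per-modality feature sizes (illustrative only).
AUDIO_DIM, VISUAL_DIM, TACTILE_DIM = 8, 16, 4
FUSED_DIM = AUDIO_DIM + VISUAL_DIM + TACTILE_DIM


def fuse(audio, visual, tactile):
    """Fuse per-modality features into one abstract multimodal state."""
    return np.concatenate([audio, visual, tactile])


# Synthetic interaction trace: a sequence of fused multimodal states with
# simple temporal structure, so the next state is partly predictable.
T = 200
states = np.empty((T, FUSED_DIM))
states[0] = fuse(rng.standard_normal(AUDIO_DIM),
                 rng.standard_normal(VISUAL_DIM),
                 rng.standard_normal(TACTILE_DIM))
for t in range(1, T):
    novel = fuse(rng.standard_normal(AUDIO_DIM),
                 rng.standard_normal(VISUAL_DIM),
                 rng.standard_normal(TACTILE_DIM))
    states[t] = 0.9 * states[t - 1] + 0.1 * novel

# Forward model W predicts state[t+1] from state[t]; the prediction error
# drives the weight update, as in predictive-learning accounts.
W = np.zeros((FUSED_DIM, FUSED_DIM))
lr = 0.01
for epoch in range(50):
    for t in range(T - 2):  # hold out the last transition for testing
        pred = W @ states[t]
        error = states[t + 1] - pred
        W += lr * np.outer(error, states[t])

# A small held-out prediction error indicates that the fused multimodal
# representation supports anticipating the next state.
print("held-out prediction error:", np.linalg.norm(states[-1] - W @ states[-2]))
```

In the project’s richer setting, the fused state would come from real sensors and the linear model would be replaced by a biologically inspired architecture; the sketch only illustrates how prediction error over a shared multimodal state can serve as a learning signal.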


Publications

Yun, H. S., Taliaronak, V., Kirtay, M., Chevelère, J., Hübert, H., Hafner, V. V., Pinkwart, N., & Lazarides, R. (2022). Challenges in Designing Teacher Robots with Motivation Based Gestures. 17th Annual ACM/IEEE International Conference on Human-Robot Interaction (HRI 2022). https://arxiv.org/abs/2302.03942
Yun, H. S., Hübert, H., Taliaronak, V., Mayet, R., Kirtay, M., Hafner, V. V., & Pinkwart, N. (2022). AI-based Open-Source Gesture Retargeting to a Humanoid Teaching Robot. AIED 2022: The 23rd International Conference on Artificial Intelligence in Education. https://link.springer.com/chapter/10.1007/978-3-031-11647-6_51
Wudarczyk, O. A., Kirtay, M., Pischedda, D., Hafner, V. V., Haynes, J.-D., Kuhlen, A. K., & Abdel Rahman, R. (2021). Robots facilitate human language production. Scientific Reports, 11(1), 16737. https://doi.org/10.1038/s41598-021-95645-9
Wudarczyk, O. A., Kirtay, M., Kuhlen, A. K., Abdel Rahman, R., Haynes, J.-D., Hafner, V. V., & Pischedda, D. (2021). Bringing Together Robotics, Neuroscience, and Psychology: Lessons Learned From an Interdisciplinary Project. Frontiers in Human Neuroscience, 15. https://doi.org/10.3389/fnhum.2021.630789
Spatola, N., & Wudarczyk, O. (2021). Ascribing emotions to robots: Explicit and implicit attribution of emotions and perceived robot anthropomorphism. Computers in Human Behavior, 124, 106934. https://doi.org/10.1016/j.chb.2021.106934
Spatola, N., & Wudarczyk, O. A. (2020). Implicit Attitudes Towards Robots Predict Explicit Attitudes, Semantic Distance Between Robots and Humans, Anthropomorphism, and Prosocial Behavior: From Attitudes to Human–Robot Interaction. International Journal of Social Robotics. https://doi.org/10.1007/s12369-020-00701-5
Pischedda, D., Erener, S., Kuhlen, A., & Haynes, J.-D. (2023). How do people discriminate conversations generated by humans and artificial intelligence? The role of individual variability on people’s judgment. ESCOP 2023.
Pischedda, D., Lange, A., Kirtay, M., Wudarczyk, O. A., Abdel Rahman, R., Hafner, V. V., Kuhlen, A. K., & Haynes, J.-D. (2021). Who is my interlocutor? Partner-specific neural representations during communicative interactions with human or artificial task partners. 5th Virtual Social Interactions (VSI) Conference.
Pischedda, D., Lange, A., Kirtay, M., Wudarczyk, O. A., Abdel Rahman, R., Hafner, V. V., Kuhlen, A. K., & Haynes, J.-D. (2021). Am I speaking to a human, a robot, or a computer? Neural representations of task partners in communicative interactions with humans or artificial agents. Neuroscience 2021.
Pischedda, D., Kaufmann, V., Wudarczyk, O., Abdel Rahman, R., Hafner, V. V., Kuhlen, A., & Haynes, J.-D. (2023). Human or AI? The brain knows it! A brain-based Turing Test to discriminate between human and artificial agents. RO-MAN 2023. https://doi.org/10.1109/RO-MAN57019.2023.10309541
Kirtay, M., Hafner, V. V., Asada, M., & Oztop, E. (2023). Trust in robot-robot scaffolding. IEEE Transactions on Cognitive and Developmental Systems. https://doi.org/10.1109/TCDS.2023.3235974
Kirtay, M., Oztop, E., Kuhlen, A. K., Asada, M., & Hafner, V. V. (2022). Trustworthiness assessment in multimodal human-robot interaction based on cognitive load. 2022 31st IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), 469–476. https://doi.org/10.1109/RO-MAN53752.2022.9900730
Kirtay, M., Oztop, E., Asada, M., & Hafner, V. V. (2021). Modeling robot trust based on emergent emotion in an interactive task. 2021 IEEE International Conference on Development and Learning (ICDL), 1–8. https://doi.org/10.1109/ICDL49984.2021.9515645
Kirtay, M., Chevalère, J., Lazarides, R., & Hafner, V. V. (2021). Learning in Social Interaction: Perspectives from Psychology and Robotics. 2021 IEEE International Conference on Development and Learning (ICDL), 1–8. https://doi.org/10.1109/ICDL49984.2021.9515648
Kirtay, M., Wudarczyk, O. A., Pischedda, D., Kuhlen, A. K., Abdel Rahman, R., Haynes, J.-D., & Hafner, V. V. (2020). Modeling robot co-representation: state-of-the-art, open issues, and predictive learning as a possible framework. 2020 Joint IEEE 10th International Conference on Development and Learning and Epigenetic Robotics (ICDL-EpiRob), 1–8. https://doi.org/10.1109/ICDL-EpiRob48136.2020.9278031
Kirtay, M., Oztop, E., Kuhlen, A. K., Asada, M., & Hafner, V. V. (2022). Forming robot trust in heterogeneous agents during a multimodal interactive game. 2022 IEEE International Conference on Development and Learning (ICDL), 307–313. https://doi.org/10.1109/ICDL53763.2022.9962212
Kirtay, M., Oztop, E., Asada, M., & Hafner, V. V. (2021). Trust me! I am a robot: an affective computational account of scaffolding in robot-robot interaction. 2021 30th IEEE International Conference on Robot & Human Interactive Communication (RO-MAN), 189–196. https://doi.org/10.1109/RO-MAN50785.2021.9515494
Eiserbeck, A., Wudarczyk, O., Kuhlen, A., Hafner, V., Haynes, J.-D., & Abdel Rahman, R. (2024). Communicative context enhances emotional word processing with human speakers but not with robots. ASSC27.
Chevalère, J., Kirtay, M., Hafner, V., & Lazarides, R. (2022). Who to Observe and Imitate in Humans and Robots: The Importance of Motivational Factors. International Journal of Social Robotics. https://doi.org/10.1007/s12369-022-00923-9
