Research

From understanding learners’ adaptive motivation and emotion to designing social learning companions

Developing an approach integrating game- and agent-based Intelligent Tutoring Systems (ITS) and computational models to help optimize social learning situations

Research Unit: 1

Project Number: 6

Example Behavior:
Social Intelligence

Disciplines:
Computer Science
Education Science
Robotics

 

Doctoral Researchers:
Anja Henke
Hae Seon Yun

Postdoctoral Researchers:
Johann Youri Chevalère

External Collaborators:
Heiko Hübert

 

Expected Project Duration
2019 - 2024



In this threefold project, we first examine how novel user-modelling approaches and feedback strategies in Intelligent Tutoring Systems (ITS) that incorporate virtual agents can enhance positive emotions and motivation (self-regulation, goal orientations), reduce negative emotions in social learning situations, and help prevent inequalities in education. Second, we explore the moderating and mediating processes that underlie the relations between pedagogical agents’ ‘behaviors’ and learners’ performance by investigating psychological factors that strengthen or weaken the effects of ITS on learners’ motivation and emotions. Third, we intend to create a robotic learning companion that maintains an up-to-date model/simulation of the learner’s current knowledge, motivational, and emotional state and acts accordingly.


Project Results

Analytic side:

Higher achieving students who worked with an ITS that adapted to both their in-system performance and their emotional experience reported less boredom than lower achieving students and than students working with an ITS that adapted to in-system performance only. These findings suggest that using emotional information to increase the contingency of the hint-delivery strategy can improve students’ learning experience.

Learners thus use ITS differently, and the effectiveness of these systems therefore varies between students. The team identified distinct behavioral patterns of students in an ITS that were differentially advantageous for motivational-affective characteristics and that were related to students’ characteristics (performance level) and to the kind of adaptivity of the artificial learning companion (emotionally and cognitively adaptive vs. cognitively adaptive only). These findings emphasize the relevance of identifying interindividual differences in learners’ behavior when they interact with artificially intelligent learning companions. Theoretical concepts of adaptivity should therefore not only consider adapting instruction to learners’ needs but also support students effectively in the learning process, as not everyone may benefit equally from such settings.

Synthetic and Analytic side:

Learners’ perception of robotic social agents was associated with their emotional experience during learning as well as with task performance. More specifically, learners who perceived the robotic tutor agent as less disturbing during social interaction in a learning situation experienced more initial on-task enjoyment, and students who perceived the robotic tutor agent as less sociable performed better in the task. On-task enjoyment was in turn associated with higher task performance. Thus, although robotic tutor agents may increase initial on-task enjoyment, certain human-like characteristics may hinder learning through their effect on emotional experience.

Learners who were more interested in the topic taught in the ITS experienced more enjoyment while learning, were more engaged in the task, and in turn performed better in the ITS. Unlike enjoyment, boredom had only minimal negative effects on engagement and task performance. These findings identify key psychological processes involved in the contribution of affective and cognitive processes to learning in the ITS context.

Students in the experimental condition perceived Cozmo (a non-humanoid robot) as more sociable than their counterparts in the control condition, and higher perceived sociability of Cozmo was associated with larger gains in topic-related interest, utility, and importance. These findings are surprising, since Pepper’s (not Cozmo’s) embodied presence and enhanced human-like attributes in the experimental condition were expected to increase students’ perceptions of Human-Agent Interaction (HAI) dimensions. Further analysis showed that higher perceptions of Cozmo’s sociability, conveyed through its embodiment in the experimental condition, contributed to increasing the extrinsic motivational values (utility and importance) that learners ascribed to the learning content after the intervention.

The team created an open-source framework, Sobotify, which enables multiple robots (e.g., Pepper, NAO, Cozmo, and MyKeepOn) to interact with learners as learning companions. Into this framework, the team integrated open-source AI-based solutions (e.g., DeepFace for facial expression recognition, VOSK for speech recognition, and MediaPipe for gestures) to animate the robots in a human-like way. The framework aims to provide an environment in which users with little or no technical background can use robots as learning companions in scenarios of their choice.
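The core design idea described above — one high-level companion interface dispatching to several robot platforms — can be sketched as follows. This is a minimal illustration of that kind of abstraction, not Sobotify’s actual API: all class and method names here are hypothetical, and a real backend would wrap the Pepper/NAO/Cozmo SDKs instead of logging actions.

```python
# Hypothetical sketch of a multi-robot learning-companion interface.
# Names are illustrative, not Sobotify's actual API.

class RobotBackend:
    """Robot-specific driver; real backends would wrap Pepper/NAO/Cozmo SDKs."""
    def __init__(self, name):
        self.name = name
        self.log = []  # stands in for actual motor/speech commands

    def say(self, text):
        self.log.append(("say", text))

    def gesture(self, motion):
        self.log.append(("gesture", motion))


class LearningCompanion:
    """Dispatches high-level tutoring actions to any registered backend,
    so the same scenario script can run on different robots."""
    def __init__(self, backend):
        self.backend = backend

    def encourage(self, learner_name):
        self.backend.gesture("wave")
        self.backend.say(f"Well done, {learner_name}! Let's try the next exercise.")


pepper = LearningCompanion(RobotBackend("Pepper"))
pepper.encourage("Alex")
print(pepper.backend.log)
```

Keeping the scenario logic (`encourage`) separate from the robot driver is what would let the same teaching script run on four different robots, as in the experiments described below.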

Using the features in Sobotify, such as DeepFace, VOSK, and MediaPipe, Pepper was designed as an empathetic learning companion that supports learners in conversing more competently in a sales conversation. Even though learning gains increased overall, most high-performing students did not perceive that the interactive learning experience improved their sales-conversation skills.

An unexpected movement that Pepper performed during the experiment resulted in learners’ perception of the robot as robotic.

An HRI experiment for the language-learning scenario was created and conducted at a vocational school. In this experiment, four robots (Pepper, NAO, Cozmo, and MyKeepOn) were controlled using Sobotify, and emotion detection and emotionally adaptive feedback were provided to the participants. Consistent with the previous study, Cozmo was found to be the most effective learning companion on the agent-value and facilitation variables of the Pedagogical Agents as Learning Companions (PAL) scale, while Pepper was found to be credible and to encourage learners to reflect on what they were learning. In terms of design, all robots attended to learners by maintaining eye contact, speaking, gesturing, and responding to learners’ detected emotions with encouraging feedback. Two features distinguished Cozmo from the other robots: 1) navigation, whereby Cozmo not only gestured with its arms (e.g., its lift) but also moved closer to the learner and retracted again, and 2) facial expression, whereby Cozmo displayed different expressions (e.g., smiling eyes to express happiness, blinking eyes). Learners may have rated Cozmo’s overall effectiveness as a learning companion higher than that of the other robots because Cozmo was not only co-present with learners but also dynamically shared the physical space with them while expressing emotions in multiple ways (gesture, speech, and facial expression).
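The emotionally adaptive feedback step in the experiment above can be illustrated with a simplified stand-in. In the study, emotion labels came from facial-expression recognition (e.g. DeepFace); here the detected label is just a string, and the feedback texts are invented for illustration — they are not the wording used in the experiment.

```python
# Simplified sketch of emotionally adaptive feedback: map a detected
# emotion label to an encouraging response the robot can speak.
# The labels follow common facial-expression-recognition categories;
# the feedback texts are illustrative only.

FEEDBACK = {
    "happy":   "Great, you seem to be enjoying this. Keep going!",
    "neutral": "You're doing fine. Shall we look at the next sentence?",
    "sad":     "This part is tricky. Let's go through it together.",
    "angry":   "Take a short break, then we can try a different example.",
    "fear":    "No pressure. We can repeat the last step as often as you like.",
}

def adaptive_feedback(emotion):
    """Return encouraging feedback for a detected emotion label,
    falling back to the neutral response for unknown labels."""
    return FEEDBACK.get(emotion, FEEDBACK["neutral"])

print(adaptive_feedback("sad"))
```

In a full pipeline, the returned string would be passed to the robot’s speech output, closing the loop from perceived learner emotion to companion behavior.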

Through workshops with teachers, the team observed that Sobotify enabled teachers to easily adopt robots as learning companions in their own teaching and learning scenarios. As this is an ongoing study, future work will investigate in more depth how users with little or no technical background utilize synthetic systems such as robots, and how these synthetic systems can respond to humans.

The team now better understands key principles of social intelligence: its experimental studies showed that learning in social interaction requires adaptive interactions on both sides, learner and teacher, and that this concept is relevant for both the synthetic and the analytic side.


Yun, H. S., Karl, M., & Fortenbacher, A. (2020). Designing an interactive second language learning scenario: a case study of Cozmo. Proceedings of HCI Korea, 384–387.
Yun, H. S., Fortenbacher, A., Geißler, S., & Heumos, T. (2020). Towards External Regulation of Emotions Using Sensors: Two Case Studies. INTED2020, 9313–9320. https://doi.org/10.21125/inted.2020.2576
Yun, H. S., Chevalère, J., Karl, M., & Pinkwart, N. (2021). A comparative study on how social robots support learners’ motivation and learning. 14th Annual International Conference of Education, Research and Innovation, 2845–2850. https://doi.org/10.21125/iceri.2021.0708
Yun, H. S., Hübert, H., Taliaronak, V., Mayet, R., Kirtay, M., Hafner, V. V., & Pinkwart, N. (2022). AI-based Open-Source Gesture Retargeting to a Humanoid Teaching Robot. AIED 2022: The 23rd International Conference on Artificial Intelligence in Education. https://link.springer.com/chapter/10.1007/978-3-031-11647-6_51
Yun, H. S., Hübert, H., Taliaronak, V., & Sardogan, A. (2022). Utilizing Machine Learning based Gesture Recognition Software, Mediapipe, in the Context of Education and Health. AI Innovation Summit 2022.
Yun, H. S., Taliaronak, V., Kirtay, M., Chevalère, J., Hübert, H., Hafner, V. V., Pinkwart, N., & Lazarides, R. (2022). Challenges in Designing Teacher Robots with Motivation Based Gestures. 17th Annual ACM/IEEE International Conference on Human-Robot Interaction (HRI 2022). https://arxiv.org/abs/2302.03942
Yun, H. S., Hübert, H., Chevalère, J., Pinkwart, N., Hafner, V., & Lazarides, R. (2023). Analyzing Learners’ Emotion from an HRI experiment using Facial Expression Recognition Systems. 25th International Conference on Human-Computer Interaction. https://doi.org/10.1007/978-3-031-34550-0_29
Yun, H. S., Hübert, H., Sardogan, A., Pinkwart, N., Hafner, V., & Lazarides, R. (2023). Humanoid Robot as a Debate Partner. 25th International Conference on Human-Computer Interaction. https://doi.org/10.1007/978-3-031-36004-6_74
Yun, H. S., & Fortenbacher, A. (2019). Listen to your body: making learners aware of their cognitive and affective state. ICER2019. https://docs.google.com/document/d/1D3XlYPBKg-Z7UoR1SunXZrAJy7oojV9S8N_OgjQQ5ck/edit
Spatola, N., Chevalère, J., & Lazarides, R. (2021). Human vs computer: What effect does the source of information have on cognitive performance and achievement goal orientation? Paladyn, Journal of Behavioral Robotics, 12(1), 175–186. https://doi.org/10.1515/pjbr-2021-0012
Spatola, N., & Wudarczyk, O. (2021). Ascribing emotions to robots: Explicit and implicit attribution of emotions and perceived robot anthropomorphism. Computers in Human Behavior, 124, 106934. https://doi.org/10.1016/j.chb.2021.106934
Lazarides, R., & Raufelder, D. (2021). Control-value theory in the context of teaching: does teaching quality moderate relations between academic self-concept and achievement emotions? British Journal of Educational Psychology, 91(1), 127–147. https://doi.org/10.1111/bjep.12352
Lazarides, R., & Chevalère, J. (2021). Artificial intelligence and education: Addressing the variability in learners’ emotion and motivation with adaptive teaching assistants. Bildung Und Erziehung, 74(3), 264–279. https://doi.org/10.13109/buer.2021.74.3.264
Kirtay, M., Chevalère, J., Lazarides, R., & Hafner, V. V. (2021). Learning in Social Interaction: Perspectives from Psychology and Robotics. 2021 IEEE International Conference on Development and Learning (ICDL), 1–8. https://doi.org/10.1109/ICDL49984.2021.9515648
Hübert, H., & Yun, H. S. (2024). Sobotify: A Framework for Turning Robots into Social Robots. Proceedings of the 2024 ACM/IEEE International Conference on Human-Robot Interaction (HRI ’24), March 11–14, 2024, Boulder, CO, USA.
Henke, A., Chevalère, J., Omarchevska, Y., Yun, H. S., Pinkwart, N., Hafner, V., & Lazarides, R. (2024). Behavioral Profiles, Motivation and Emotions in an Intelligent Tutoring System. International Conference on Motivation (ICM) 2024.
Henke, A., Chevalère, J., Westphal, A., Yun, H. S., Pinkwart, N., Hafner, V., & Lazarides, R. (2024). Gender-related perceptions of robotic tutoring agents in human-robot interaction. Gender & STEM 2024.
Henke, A., Chevalère, J., Yun, H. S., Pinkwart, N., Hafner, V., & Lazarides, R. (2024). Lernen mit sozialen Robotern und intelligenten tutoriellen Systemen - Welche Rolle spielen wahrgenommenes soziales Interaktionsverhalten des Roboters, Lernfreude und kognitive Belastung für den Lernfortschritt? GEBF 2024.
Henke, A., Harley, J. M., Matin, N., & Lazarides, R. (2025). Arousal and Traits Matter More than States: Situated Emotions in Technology-Enhanced Learning. EARLI 2025.
Chevalère, J., Kirtay, M., Hafner, V., & Lazarides, R. (2022). Who to Observe and Imitate in Humans and Robots: The Importance of Motivational Factors. International Journal of Social Robotics. https://doi.org/10.1007/s12369-022-00923-9
Chevalère, J., Lazarides, R., Yun, H. S., Henke, A., Lazarides, C., Pinkwart, N., & Hafner, V. (2023). Do instructional strategies considering activity emotions reduce students’ boredom in a computerized open-ended learning environment? Computers & Education, 196. https://doi.org/10.1016/j.compedu.2023.104741
Ackermann, H., Henke, A., Chevalère, J., Yun, H. S., Hafner, V. V., Pinkwart, N., & Lazarides, R. (2025). Physical embodiment and anthropomorphism of AI tutors and their role in student enjoyment and performance. Npj Science of Learning, 10(1). https://doi.org/10.1038/s41539-024-00293-z
