SCIoI Alumni

Hae Seon Yun

Doctoral Researcher

Computer Science

HU Berlin

Email: yunhaese@informatik.hu-berlin.de

Photo: SCIoI


Hae Seon Yun was a Ph.D. candidate at Humboldt University Berlin, where she worked on her dissertation, “Multimodal Learning Companions”. At SCIoI, she was a doctoral researcher in Project 06, “From understanding learners’ adaptive motivation and emotion to designing social learning companions”. In the integration project, her role was to consolidate the concepts of intelligence discussed at SCIoI and to construct the demonstrator. She enjoys interdisciplinary research environments and is passionate about bringing diverse perspectives from multiple fields into her research.


Projects

Hae Seon Yun is a member of Project 06 and Project B2.


Publications

Yun, H. S., Hübert, H., Pinkwart, N., & Hafner, V. V. (2024). Design Based Research on Multimodal Robotic Learning Companion. 25th International Conference on Artificial Intelligence in Education. https://doi.org/10.1007/978-3-031-64312-5_12
Hübert, H., & Yun, H. S. (2024). Sobotify: A Framework for Turning Robots into Social Robots. Proceedings of the 2024 ACM/IEEE International Conference on Human-Robot Interaction (HRI ’24), March 11–14, 2024, Boulder, CO, USA.
Chevalère, J., Lazarides, R., Yun, H. S., Henke, A., Lazarides, C., Pinkwart, N., & Hafner, V. (2023). Do instructional strategies considering activity emotions reduce students’ boredom in a computerized open-ended learning environment? Computers & Education, 196. https://doi.org/10.1016/j.compedu.2023.104741
Yun, H. S., Hübert, H., Sardogan, A., Pinkwart, N., Hafner, V., & Lazarides, R. (2023). Humanoid Robot as a Debate Partner. 25th International Conference on Human-Computer Interaction. https://doi.org/10.1007/978-3-031-36004-6_74
Yun, H. S., Hübert, H., Chevalère, J., Pinkwart, N., Hafner, V., & Lazarides, R. (2023). Analyzing Learners’ Emotion from an HRI Experiment Using Facial Expression Recognition Systems. 25th International Conference on Human-Computer Interaction. https://doi.org/10.1007/978-3-031-34550-0_29
Yun, H. S., Taliaronak, V., Kirtay, M., Chevalère, J., Hübert, H., Hafner, V. V., Pinkwart, N., & Lazarides, R. (2022). Challenges in Designing Teacher Robots with Motivation Based Gestures. 17th Annual ACM/IEEE International Conference on Human-Robot Interaction (HRI 2022). https://arxiv.org/abs/2302.03942
Yun, H. S., Hübert, H., Taliaronak, V., Mayet, R., Kirtay, M., Hafner, V. V., & Pinkwart, N. (2022). AI-based Open-Source Gesture Retargeting to a Humanoid Teaching Robot. AIED 2022: The 23rd International Conference on Artificial Intelligence in Education. https://link.springer.com/chapter/10.1007/978-3-031-11647-6_51
Yun, H. S., Hübert, H., Taliaronak, V., & Sardogan, A. (2022). Utilizing Machine Learning based Gesture Recognition Software, Mediapipe, in the Context of Education and Health. AI Innovation Summit 2022.
Yun, H. S., Chevalère, J., Karl, M., & Pinkwart, N. (2021). A comparative study on how social robots support learners’ motivation and learning. 14th Annual International Conference of Education, Research and Innovation, 2845–2850. https://doi.org/10.21125/iceri.2021.0708
Yun, H. S., Fortenbacher, A., Geißler, S., & Heumos, T. (2020). Towards External Regulation of Emotions Using Sensors: Two Case Studies. INTED2020, 9313–9320. https://doi.org/10.21125/inted.2020.2576
Yun, H. S., Karl, M., & Fortenbacher, A. (2020). Designing an interactive second language learning scenario: a case study of Cozmo. Proceedings of HCI Korea, 384–387.
Yun, H. S., & Fortenbacher, A. (2019). Listen to your body: making learners aware of their cognitive and affective state. ICER2019. https://docs.google.com/document/d/1D3XlYPBKg-Z7UoR1SunXZrAJy7oojV9S8N_OgjQQ5ck/edit
