Multi-Platform Intelligent System for Multimodal Human-Computer Interaction

Authors

  • Mateusz Jarosz, Institute of Computer Science, AGH University of Science and Technology, Krakow, Poland
  • Piotr Nawrocki, Institute of Computer Science, AGH University of Science and Technology, Krakow, Poland
  • Bartłomiej Śnieżyński, Institute of Computer Science, AGH University of Science and Technology, Krakow, Poland
  • Bipin Indurkhya, Institute of Philosophy, Jagiellonian University, Krakow, Poland

DOI:

https://doi.org/10.31577/cai_2021_1_83

Keywords:

Human–computer interaction, multi-platform, intelligent system architecture, multimodal system, humanoid robot

Abstract

We present a flexible human–robot interaction architecture that incorporates emotions and moods to provide a natural experience for humans. To determine the emotional state of the user, information representing eye gaze and facial expression is combined with other contextual information, such as whether the user is asking questions or has been quiet for some time. Subsequently, an appropriate robot behaviour is selected from a multi-path scenario. This architecture can be easily adapted to interactions with non-embodied robots, such as avatars on a mobile device or a PC. We present evaluation results for an implementation of the proposed architecture as a whole, as well as for its emotion-detection and question-detection modules. The results are promising and provide a basis for further development.
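
As an informal illustration of the fusion-and-selection idea summarised above (not the paper's actual implementation), the Python sketch below combines a facial-expression estimate with eye-gaze and conversational context to choose the next robot behaviour from a small multi-path scenario. All labels, thresholds, class names and behaviours here are assumptions introduced for this example only.

    from dataclasses import dataclass

    @dataclass
    class Observation:
        facial_emotion: dict    # hypothetical scores, e.g. {"happy": 0.6, "neutral": 0.3, "sad": 0.1}
        gaze_on_robot: bool     # whether the user's eye gaze is directed at the robot
        asked_question: bool    # contextual cue: the user has asked a question
        silent_seconds: float   # contextual cue: how long the user has been quiet

    def estimate_state(obs: Observation) -> str:
        """Fuse facial-expression scores with gaze and conversational context
        into a single user-state label (illustrative heuristic only)."""
        emotion = max(obs.facial_emotion, key=obs.facial_emotion.get)
        if obs.asked_question:
            return "engaged"
        if obs.silent_seconds > 10 and not obs.gaze_on_robot:
            return "disengaged"
        return {"happy": "engaged", "neutral": "attentive", "sad": "needs_support"}[emotion]

    # A multi-path scenario sketched as a mapping from user state to the next robot behaviour.
    SCENARIO = {
        "engaged": "continue_current_topic",
        "attentive": "offer_new_topic",
        "needs_support": "express_empathy",
        "disengaged": "ask_re_engaging_question",
    }

    def select_behaviour(obs: Observation) -> str:
        return SCENARIO[estimate_state(obs)]

    if __name__ == "__main__":
        obs = Observation(
            facial_emotion={"happy": 0.1, "neutral": 0.3, "sad": 0.6},
            gaze_on_robot=True,
            asked_question=False,
            silent_seconds=2.0,
        )
        print(select_behaviour(obs))  # -> express_empathy

In the actual system, the emotion and question detectors are dedicated modules and the scenario is considerably richer; the sketch only indicates the overall control flow from sensed cues to behaviour selection.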

Published

2021-08-03

How to Cite

Jarosz, M., Nawrocki, P., Śnieżyński, B., & Indurkhya, B. (2021). Multi-Platform Intelligent System for Multimodal Human-Computer Interaction. COMPUTING AND INFORMATICS, 40(1), 83–103. https://doi.org/10.31577/cai_2021_1_83