Towards Context-Aware Facial Emotion Reaction Database for Dyadic Interaction Settings



Sham A. H., Khan A., Lamas D., Tikka P., Anbarjafari G.

Sensors, vol. 23, no. 1, 2023 (SCI-Expanded)

  • Publication Type: Article / Full Article
  • Volume: 23 Issue: 1
  • Publication Date: 2023
  • DOI: 10.3390/s23010458
  • Journal Name: Sensors
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Academic Search Premier, Aerospace Database, Aqualine, Aquatic Science & Fisheries Abstracts (ASFA), Biotechnology Research Abstracts, CAB Abstracts, Communication Abstracts, Compendex, EMBASE, INSPEC, MEDLINE, Metadex, Veterinary Science Database, Directory of Open Access Journals, Civil Engineering Abstracts
  • Keywords: facial expression analysis, emotion recognition, emotion reaction, responsible AI, data collection
  • Affiliated with Yıldız Teknik Üniversitesi: No

Abstract

Emotion recognition is a significant issue in many sectors that use human emotional reactions as a communication channel, such as marketing, technological equipment, or human–robot interaction. Realistic facial behavior in social robots and artificial agents is still a challenge, limiting their emotional credibility in dyadic face-to-face situations with humans. One obstacle is the lack of appropriate training data on how humans typically interact in such settings. This article focuses on the facial behavior of 60 participants, collected to create a new type of dyadic emotion reaction database. For this purpose, we propose a methodology that automatically captures the facial expressions of participants via webcam while they are engaged with other people (through facial videos) in emotionally primed contexts. The data were then analyzed with three Facial Expression Analysis (FEA) tools: iMotions, the Mini-Xception model, and the Py-Feat FEA toolkit. Although the emotion reactions were reported as genuine, a comparative analysis showed that the three models did not agree on a single emotion reaction prediction, indicating that a more robust and effective model for emotion reaction prediction is needed. The relevance of this work for human–computer interaction studies lies in its novel approach to developing adaptive behaviors for synthetic human-like beings (virtual or robotic), allowing them to simulate human facial interaction behavior in contextually varying dyadic situations with humans. This article should be useful for researchers working on human emotion analysis who are deciding on a suitable methodology for collecting facial expression reactions in a dyadic setting.
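As a concrete illustration of the frame-wise analysis step, the sketch below shows how one of the three tools named above, the open-source Py-Feat toolkit, can score emotions in a recorded facial video. This is a minimal sketch, not the study's actual pipeline: the file name participant_01.mp4 is a hypothetical placeholder, and the code assumes Py-Feat's documented Detector API with its default models.

```python
# Minimal sketch: per-frame emotion scoring of a webcam recording with
# Py-Feat (pip install py-feat). "participant_01.mp4" is a hypothetical
# placeholder; the study's actual data layout is not specified here.
from feat import Detector

detector = Detector()  # default face, landmark, AU, and emotion models

# Detect faces and score emotions across the video's frames; the result
# is a Fex dataframe with one row per detected face per analyzed frame.
predictions = detector.detect_video("participant_01.mp4")

# .emotions holds per-frame probabilities for anger, disgust, fear,
# happiness, sadness, surprise, and neutral; take the argmax as a label.
per_frame_labels = predictions.emotions.idxmax(axis=1)
print(per_frame_labels.value_counts())
```

The reported disagreement between the tools can be made operational with a simple agreement statistic. The snippet below is a hedged example using scikit-learn's Cohen's kappa; the paper does not state which agreement measure was used, and the label lists are invented stand-ins, not data from the study.

```python
# Hypothetical example: quantify inter-tool agreement on per-frame
# emotion labels with Cohen's kappa. The labels below are invented.
from sklearn.metrics import cohen_kappa_score

labels_tool_a = ["happiness", "happiness", "neutral", "surprise", "neutral"]
labels_tool_b = ["happiness", "neutral", "neutral", "surprise", "fear"]

kappa = cohen_kappa_score(labels_tool_a, labels_tool_b)
print(f"Inter-tool agreement (Cohen's kappa): {kappa:.2f}")  # low kappa = weak agreement
```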