18th IEEE International Conference on Automatic Face and Gesture Recognition, FG 2024, İstanbul, Turkey, 27 - 31 May 2024
Electroencephalography (EEG) signals are widely regarded as useful for identifying affective states. This study introduces a novel methodology for eliciting affective responses in virtual reality, featuring an immersive virtual environment in which subjects encounter an avatar displaying the six basic emotions: fear, joy, anger, sadness, disgust, and surprise. Subjects are asked to mimic the avatar's expressions while their brain activity is recorded with a multi-channel EEG headset using dry-sensor technology, which reduces interference from facial movements and musculature. The study's cohort consisted of 40 individuals, each contributing 720 data points per emotional stimulus. A key contribution of this research is the creation and release of a novel dataset (EmoNeuroDB) resulting from this endeavor, which served as the basis for a competition held at FG 2024. This paper describes the winning team's methodological framework and emphasizes the dataset's importance as a primary contribution to the field.