Recognizing touch gestures for social human-robot interaction


Altuğlu T. B., Altun K.

ACM International Conference on Multimodal Interaction, ICMI 2015, Washington, United States Of America, 9 - 13 November 2015, pp.407-413

  • Publication Type: Conference Paper / Full Text
  • Doi Number: 10.1145/2818346.2830600
  • City: Washington
  • Country: United States Of America
  • Page Numbers: pp.407-413
  • Keywords: Feature selection, Gesture recognition, Human-robot interaction, Random forests, Sequential floating forward search
  • Yıldız Technical University Affiliated: No

Abstract

In this study, we performed touch gesture recognition on the two datasets provided by the "Recognition of Social Touch Gestures Challenge 2015". In the first dataset, dubbed Corpus of Social Touch (CoST), touch is performed on a mannequin arm, whereas in the second dataset, Human-Animal Affective Robot Touch (HAART), touch is performed in a human-pet interaction setting. CoST includes 14 gestures and HAART includes 7 gestures. We used the pressure data, image features, the Hurst exponent, Hjorth parameters, and autoregressive model coefficients as features, and performed feature selection using sequential forward floating search. We obtained classification results of around 60-70% for the HAART dataset. For the CoST dataset, the results range from 26% to 95% depending on the selection of the training/test sets.
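As a rough illustration of the pipeline the abstract describes, the Python sketch below computes Hjorth parameters, a rescaled-range Hurst exponent, and least-squares autoregressive coefficients from a one-dimensional pressure signal, then runs sequential floating forward selection with a random-forest wrapper. This is a minimal sketch, not the authors' code: the synthetic signals and labels, the AR order, the number of features retained, and the use of mlxtend's SequentialFeatureSelector (with floating=True) around scikit-learn's RandomForestClassifier are all assumptions made here for the example.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from mlxtend.feature_selection import SequentialFeatureSelector

    def hjorth_parameters(x):
        """Hjorth activity, mobility, and complexity of a 1-D signal."""
        dx = np.diff(x)    # first difference approximates the derivative
        ddx = np.diff(dx)  # second difference
        var_x, var_dx, var_ddx = np.var(x), np.var(dx), np.var(ddx)
        activity = var_x
        mobility = np.sqrt(var_dx / var_x)
        complexity = np.sqrt(var_ddx / var_dx) / mobility
        return activity, mobility, complexity

    def hurst_exponent(x, min_win=8):
        """Rescaled-range (R/S) estimate of the Hurst exponent."""
        n = len(x)
        sizes = np.unique(
            np.logspace(np.log10(min_win), np.log10(n // 2), 10).astype(int)
        )
        rs = []
        for w in sizes:
            chunks = x[: (n // w) * w].reshape(-1, w)
            # cumulative deviation from the per-chunk mean
            dev = np.cumsum(chunks - chunks.mean(axis=1, keepdims=True), axis=1)
            r = dev.max(axis=1) - dev.min(axis=1)
            s = chunks.std(axis=1)
            rs.append(np.mean(r[s > 0] / s[s > 0]))
        # Hurst exponent = slope of log(R/S) versus log(window size)
        slope, _ = np.polyfit(np.log(sizes), np.log(rs), 1)
        return slope

    def ar_coefficients(x, order=4):
        """Least-squares fit of an AR model x[t] ~ sum_k a_k * x[t-k]."""
        X = np.column_stack(
            [x[order - k - 1 : len(x) - k - 1] for k in range(order)]
        )
        y = x[order:]
        coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
        return coeffs

    # Illustrative usage on synthetic "pressure" signals (placeholder data,
    # not the CoST/HAART recordings).
    rng = np.random.default_rng(0)
    signals = rng.standard_normal((40, 256))  # 40 fake gesture recordings
    labels = rng.integers(0, 2, size=40)      # fake binary gesture labels
    X = np.array(
        [
            [*hjorth_parameters(s), hurst_exponent(s), *ar_coefficients(s)]
            for s in signals
        ]
    )

    sffs = SequentialFeatureSelector(
        RandomForestClassifier(n_estimators=100, random_state=0),
        k_features=4,   # number of features to keep (illustrative choice)
        forward=True,
        floating=True,  # the floating step turns plain SFS into SFFS
        scoring="accuracy",
        cv=3,
    )
    sffs = sffs.fit(X, labels)
    print("selected feature indices:", sffs.k_feature_idx_)

The floating step lets the search conditionally drop previously selected features after each forward addition, which is what distinguishes sequential floating forward search from plain sequential forward selection.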