Predicting an Individual’s Gestures from the Interlocutor’s Co-occurring Gestures and Related Speech

Publication: Contribution to book/anthology/report › Conference article in proceedings › Research › peer review

Abstract

Overlapping speech and gestures are common in face-to-face conversations and have been interpreted as a sign of synchronization between conversation participants; a number of gestures are even mirrored or mimicked. We therefore hypothesize that the gestures of one subject can contribute to the prediction of gestures of the same type produced by the other subject. In this work, we also investigate whether the speech segments to which these gestures are related contribute to the prediction. The results of our pilot experiments show that a Naive Bayes classifier trained on the duration and shape features of head movements and facial expressions contributes to the identification of the presence and shape of head movements and facial expressions, respectively. Speech only contributes to the prediction in the case of facial expressions. The obtained results show that the gestures of the interlocutors are one of the numerous factors to be accounted for when modeling gesture production in conversational interactions, which is relevant to the development of socio-cognitive ICT.
Original language: English
Title: Proceedings of the IEEE 7th International Conference on Cognitive Infocommunications
Number of pages: 5
Publisher: IEEE Signal Processing Society
Publication date: 2016
Pages: 233-237
ISBN (Print): 978-1-5090-2644-9
Status: Published - 2016
