Humans communicate their affective states through different media, both verbal and non-verbal, often simultaneously.
Knowledge of a person's emotional state plays a key role in providing personalized and context-aware information and services. This is the main reason why several algorithms for automatic emotion recognition have been proposed in recent years.
In this work we exploit the correlation between a person's affective state and their simultaneous body expressions in terms of speech and gestures, and propose a system for real-time emotion recognition from gestures. In a first step, the system builds a trusted dataset of association pairs (motion data → emotion pattern), partly based on textual information. This dataset serves as the ground truth for a second step, in which emotion patterns are extracted from new, unclassified gestures. Experimental results demonstrate good recognition accuracy and the real-time capabilities of the proposed system.
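To make the two-step pipeline concrete, below is a minimal sketch of the idea: a labeled set of (motion feature vector → emotion label) pairs is first assembled as ground truth, and new gestures are then classified against it. All specifics here are assumptions for illustration: the abstract does not state the feature representation or the classifier, so a fixed-length motion descriptor and a k-nearest-neighbors model are used as stand-ins, with randomly generated placeholder data.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Hypothetical gesture features: each row is a fixed-length motion descriptor
# (e.g., joint velocities/positions aggregated over a gesture window).
# Placeholder values only; the paper's actual features are not specified here.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 12))    # motion data for 200 recorded gestures
y_train = rng.integers(0, 4, size=200)  # assumed emotion labels, e.g. 0..3

# Step 1: the "trusted dataset" of (motion data -> emotion pattern) pairs
# serves as ground truth for a classifier (k-NN chosen as a simple stand-in).
clf = KNeighborsClassifier(n_neighbors=5)
clf.fit(X_train, y_train)

# Step 2: extract an emotion pattern from a new, unclassified gesture.
new_gesture = rng.normal(size=(1, 12))
predicted_emotion = clf.predict(new_gesture)[0]
print(f"Predicted emotion label: {predicted_emotion}")
```

A lazy learner such as k-NN fits the real-time framing in the abstract, since classifying a single incoming gesture reduces to a fast nearest-neighbor lookup against the trusted dataset; any lightweight classifier trained on the same pairs would slot into the same two-step structure.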