Presentation title
Perception-Comprehension-Action Cycle Enhancement based on Emotions in Human-Robot Interactions

Authors
Daniela De Venuto, Giovanni Mezzina, Michele Ruta, Eugenio Di Sciascio

Institution(s)
Politecnico di Bari

Presentation type
Presentation of a research group from one or more scientific institutions

Abstract
Emotion recognition systems have recently been gaining the attention of the scientific community due to their wide range of applications. They are used in marketing to provide an immersive purchase experience (e.g., by suggesting products that evoke positive experiences in the buyer), in advanced driver assistance systems (e.g., smoothing a nervous driving style), and in the psychophysical monitoring of patients during drug treatment; here, emotion recognition is applied to enhance the affective loop in social human-robot interactions (HRIs). This work presents a generally applicable emotion recognition system that can discriminate up to 8 different emotions, basing its operation on the arousal-valence-dominance model. For this purpose, the proposed system exploits non-maskable physiological signals, namely the user's cortical activity from five different cortical areas. The system operates in two main phases: a calibration phase and an online emotion discrimination phase. During calibration, the system exploits a novel systematic grid-search-based routine for the feature extraction (FE) step. It jointly analyzes the implementation ease of several selected FE techniques (via dedicated software metrics) and the related accuracy. Next, the system selects the optimal feature combination that maximizes accuracy while minimizing implementation complexity. To speed up real-time discrimination, a Classifier in the Loop (CIL) approach is embedded for feature selection purposes. Finally, the selected features are used to train a set of three classifiers (one for each of the arousal, valence, and dominance parameters), realizing a specific-knowledge classifier committee. Experimental tests on a public dataset (i.e., DEAP) demonstrated that the proposed system effectively improves on the state of the art, achieving an accuracy of ~75% in discriminating 8 emotions using only cortical signals.
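The committee of three binary classifiers (one each for arousal, valence, and dominance) yields 2^3 = 8 combined outcomes, which is how the system can distinguish 8 emotions. A minimal sketch of this decision fusion, with illustrative emotion labels and toy threshold classifiers standing in for the trained models (the actual labels and classifiers are assumptions, not taken from the abstract):

```python
# Sketch: fusing three binary arousal/valence/dominance (AVD) decisions
# into one of 8 emotion classes (the 8 octants of the AVD space).
# Emotion names below are illustrative assumptions.

AVD_TO_EMOTION = {
    # (arousal, valence, dominance): emotion label (1 = high, 0 = low)
    (1, 1, 1): "elation",
    (1, 1, 0): "joy",
    (1, 0, 1): "anger",
    (1, 0, 0): "fear",
    (0, 1, 1): "contentment",
    (0, 1, 0): "relaxation",
    (0, 0, 1): "boredom",
    (0, 0, 0): "sadness",
}

def committee_decision(arousal_clf, valence_clf, dominance_clf, features):
    """Combine three per-dimension binary classifiers into one emotion label."""
    a = arousal_clf(features)
    v = valence_clf(features)
    d = dominance_clf(features)
    return AVD_TO_EMOTION[(a, v, d)]

# Toy stand-in classifiers, each thresholding one feature of the vector.
label = committee_decision(
    lambda f: int(f[0] > 0.5),  # arousal classifier (hypothetical)
    lambda f: int(f[1] > 0.5),  # valence classifier (hypothetical)
    lambda f: int(f[2] > 0.5),  # dominance classifier (hypothetical)
    [0.8, 0.9, 0.2],
)
print(label)  # high arousal, high valence, low dominance -> "joy"
```

In the real system, each lambda would be replaced by a classifier trained during the calibration phase on the selected cortical features.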