
Accepted for/Published in: JMIR Human Factors

Date Submitted: Jan 4, 2023
Date Accepted: Nov 29, 2023

The final, peer-reviewed published version of this preprint can be found here:

The Role of Coherent Robot Behavior and Embodiment in Emotion Perception and Recognition During Human-Robot Interaction: Experimental Study

Fiorini L, D'Onofrio G, Sorrentino A, Cornacchia Loizzo FG, Russo S, Ciccone F, Giuliani F, Sancarlo D, Cavallo F


JMIR Hum Factors 2024;11:e45494

DOI: 10.2196/45494

PMID: 38277201

PMCID: 10858416

On the role of coherent robot behaviour and embodiment in emotion perception and recognition during interaction: experimental study

  • Laura Fiorini; 
  • Grazia D'Onofrio; 
  • Alessandra Sorrentino; 
  • Federica Gabriella Cornacchia Loizzo; 
  • Sergio Russo; 
  • Filomena Ciccone; 
  • Francesco Giuliani; 
  • Daniele Sancarlo; 
  • Filippo Cavallo

ABSTRACT

Social robots are becoming increasingly important in our daily lives as companions. Humans therefore expect to interact with them using the same mental rules applied to human-human interaction, including the use of non-verbal channels. In recent years, research efforts have been devoted to understanding users' needs and mental models and to developing behavioral models for robots that can perceive the user's state and plan an appropriate reaction. In this context, the aim of this paper is twofold. First, it assesses the user's perceived emotions, in terms of valence, arousal, and dominance, when interacting with a robot that does or does not show non-verbal cues, compared with the same emotions elicited through a personal computer. Second, it evaluates the robot's performance in recognizing emotions using three supervised machine learning algorithms, namely Support Vector Machine, Random Forest, and K-Nearest Neighbor. Specifically, this paper presents a study in which 60 healthy subjects were asked to interact with the Pepper robot while three "basic" emotions (i.e., positive, negative, and neutral) were elicited using a set of 60 images retrieved from a standardized database, to investigate (i) the role of the robot's embodiment in emotion perception and (ii) the role of its coherent movement in emotion recognition. The results show significant differences in the users' perceived emotions with respect to the web interface, underlining the importance of the robot's non-verbal communication and embodiment. Additionally, the results show a good emotion recognition rate (accuracy higher than 0.85 with the best classifiers), suggesting that the use of multimodal communication channels improves the recognition of the user's emotional state.
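To illustrate the classification setup named in the abstract, the sketch below trains the three supervised classifier families (Support Vector Machine, Random Forest, and K-Nearest Neighbor) on a 3-class problem with scikit-learn. This is not the authors' pipeline: the feature vectors here are synthetic stand-ins generated with `make_classification`, and all hyperparameters are illustrative assumptions.

```python
# Illustrative sketch only, not the study's actual pipeline: synthetic
# feature vectors stand in for the multimodal features recorded during
# human-robot interaction.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Synthetic 3-class data set (positive / negative / neutral).
X, y = make_classification(n_samples=300, n_features=20, n_informative=10,
                           n_classes=3, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

# The three classifier families named in the abstract.
classifiers = {
    "SVM": SVC(kernel="rbf"),
    "Random Forest": RandomForestClassifier(n_estimators=100, random_state=0),
    "KNN": KNeighborsClassifier(n_neighbors=5),
}

for name, clf in classifiers.items():
    clf.fit(X_train, y_train)
    print(f"{name}: accuracy = {clf.score(X_test, y_test):.2f}")
```

On real data, the reported accuracies above 0.85 would come from the study's own features and evaluation protocol; this sketch only shows the comparison structure.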




© The authors. All rights reserved. This is a privileged document currently under peer-review/community review (or an accepted/rejected manuscript). Authors have provided JMIR Publications with an exclusive license to publish this preprint on its website for review and ahead-of-print citation purposes only. While the final peer-reviewed paper may be licensed under a CC-BY license on publication, at this stage the authors and publisher expressly prohibit redistribution of this draft paper other than for review purposes.