Accepted for/Published in: JMIR mHealth and uHealth
Date Submitted: Feb 15, 2019
Open Peer Review Period: Feb 19, 2019 - Apr 16, 2019
Date Accepted: Jul 19, 2019
Opportunities and pitfalls in applying emotion recognition software for persons with a visual impairment in real-life applications.
ABSTRACT
Background:
A large part of the communication between persons is nonverbal. Persons with a visual impairment (PVIs) are often unable to perceive nonverbal cues such as facial expressions of emotion. In a previous study, we determined that PVIs can improve their ability to recognize facial expressions of emotion in validated pictures and videos by using an emotion recognition system that signals vibrotactile cues, each associated with one of six basic emotions.
Objective:
In this study, we determined whether an emotion recognition system to support PVIs works as well in realistic situations as it did under controlled laboratory conditions.
Methods:
The emotion recognition system consists of a camera mounted on spectacles, a tablet running facial emotion recognition software, and a waist belt with vibrotactors that provide haptic feedback representing Ekman's six universal emotions. Eight PVIs (four females, four males; mean age 46.75 years, range 28-66) participated in two training sessions followed by one experimental session. During the experiment, participants engaged in two 15-minute conversations, in one of which they wore the emotion recognition system. To conclude the study, exit interviews were conducted to assess the participants' experiences. Because of technical issues with the recording by the emotion recognition software, only six participants were included in the video analysis.
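The core dispatch step implied by this setup — mapping the emotion detected in a video frame to one of six belt vibrotactors — can be sketched as follows. This is an illustrative assumption, not the authors' implementation: the emotion-to-tactor assignment, the per-frame confidence scores, and the 0.6 threshold are all hypothetical.

```python
# Hypothetical mapping from Ekman's six universal emotions to distinct
# vibrotactor positions on the waist belt (assignment is illustrative).
EMOTION_TO_TACTOR = {
    "happiness": 0,
    "sadness": 1,
    "anger": 2,
    "fear": 3,
    "surprise": 4,
    "disgust": 5,
}

# Assumed cutoff to suppress spurious vibrations on low-confidence frames.
CONFIDENCE_THRESHOLD = 0.6


def select_tactor(emotion_scores):
    """Return the index of the belt tactor to activate, or None.

    emotion_scores: dict mapping emotion name -> confidence in [0, 1],
    as a facial emotion recognition engine might report per video frame.
    Only the highest-scoring emotion is signaled, and only when its
    confidence clears the threshold.
    """
    emotion, score = max(emotion_scores.items(), key=lambda kv: kv[1])
    if score < CONFIDENCE_THRESHOLD:
        return None
    return EMOTION_TO_TACTOR[emotion]
```

For example, a frame scored as mostly happiness would fire tactor 0, while a frame with no emotion above the threshold would produce no vibration at all — a design choice that trades responsiveness for fewer false signals to the wearer.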
Results:
We found that participants quickly learned, distinguished, and remembered the vibrotactile signals associated with the six emotions. Four participants felt able to use the vibrotactile signals during the conversation. Five of the six participants had no difficulty keeping the camera focused on their conversation partner. The emotion recognition software was highly accurate in detecting happiness but performed unsatisfactorily in recognizing the other five universal emotions.
Conclusions:
The system requires essential improvements in performance and wearability before it is ready to support PVIs in their daily-life interactions. Nevertheless, participants saw potential in the system as an assistive technology, provided their user requirements can be met.
Citation
Per the author's request the PDF is not available.
Copyright
© The authors. All rights reserved. This is a privileged document currently under peer review/community review (or an accepted/rejected manuscript). The authors have provided JMIR Publications with an exclusive license to publish this preprint on its website for review and ahead-of-print citation purposes only. While the final peer-reviewed paper may be licensed under a CC BY license on publication, at this stage the authors and publisher expressly prohibit redistribution of this draft paper other than for review purposes.