Accepted for/Published in: Journal of Medical Internet Research
Date Submitted: Mar 21, 2019
Open Peer Review Period: Mar 25, 2019 - May 6, 2019
Date Accepted: Feb 9, 2020
Towards Continuous Social Phenotyping: Analyzing Gaze Patterns in an Emotion Recognition Task for Children with Autism through Wearable Smart Glasses
ABSTRACT
Background:
Recent studies have shown that facial attention differs in children with autism. Measuring eye gaze and emotion recognition in children with autism is challenging, as standard clinical assessments must be delivered in clinical settings by a trained clinician. Wearable technologies may be able to bring eye gaze and emotion recognition into naturalistic social interactions.
Objective:
To test (1) the feasibility of tracking gaze using wearable smart glasses during a facial expression recognition task and (2) the ability of these gaze tracking data, together with facial expression recognition responses, to distinguish children with autism from neurotypical controls.
Methods:
We compared the eye gaze and emotion recognition patterns of 16 children with autism spectrum disorder (ASD) and 17 children without ASD via wearable smart glasses fitted with a custom eye tracker. Children identified static facial expressions in images presented on a computer screen, alongside non-social distractors, while wearing Google Glass and the eye tracker. Faces were presented in three trials, during one of which children received feedback in the form of the correct classification. We employed hybrid human-labeling and computer vision-enabled methods for pupil tracking and world-eye gaze translation calibration. We analyzed the impact of gaze and emotion recognition features in a prediction task aiming to distinguish ASD vs. neurotypical control (NC) participants.
Results:
Gaze and emotion recognition patterns enabled training of a classifier distinguishing the ASD and NC groups. However, this classifier did not significantly outperform classifiers given only age and gender metadata, suggesting that further work is necessary to disentangle these effects.
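The abstract does not specify the classifier used. As a hedged illustration of the comparison it describes, the sketch below evaluates a simple nearest-centroid classifier with leave-one-out cross-validation, once on hypothetical gaze/emotion features and once on an age-and-gender metadata baseline. All feature values, group means, and the classifier choice are assumptions for illustration, not data or methods from the study.

```python
import random

# All numbers below are synthetic illustrations, not data from the study.
# Per participant: gaze/emotion features = [face-fixation fraction,
# emotion-recognition accuracy]; metadata = [age (years), gender (0/1)].
random.seed(0)

def make_group(n, fix_mu, acc_mu, label):
    """Generate n synthetic participants with the given group means."""
    return [([random.gauss(fix_mu, 0.05), random.gauss(acc_mu, 0.05)],
             [random.uniform(6.0, 12.0), random.choice([0.0, 1.0])],
             label) for _ in range(n)]

# Group sizes match the study (16 ASD = label 1, 17 NC = label 0);
# the group means are purely hypothetical.
data = make_group(16, 0.55, 0.70, 1) + make_group(17, 0.65, 0.80, 0)

def centroid(rows):
    """Component-wise mean of a list of feature vectors."""
    return [sum(r[i] for r in rows) / len(rows) for i in range(len(rows[0]))]

def loo_accuracy(X, y):
    """Leave-one-out accuracy of a nearest-centroid classifier."""
    hits = 0
    for i in range(len(X)):
        train = [(x, lab) for j, (x, lab) in enumerate(zip(X, y)) if j != i]
        c1 = centroid([x for x, lab in train if lab == 1])
        c0 = centroid([x for x, lab in train if lab == 0])
        d1 = sum((a - b) ** 2 for a, b in zip(X[i], c1))
        d0 = sum((a - b) ** 2 for a, b in zip(X[i], c0))
        hits += int((1 if d1 < d0 else 0) == y[i])
    return hits / len(X)

y = [lab for _, _, lab in data]
gaze_acc = loo_accuracy([g for g, _, _ in data], y)
meta_acc = loo_accuracy([m for _, m, _ in data], y)
print(f"gaze/emotion features: {gaze_acc:.2f}, metadata only: {meta_acc:.2f}")
```

With 33 participants, leave-one-out is a natural choice because it uses nearly all the data for training in each fold; the study's finding corresponds to the feature-based accuracy not significantly exceeding the metadata-only accuracy.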
Conclusions:
While wearable smart glasses show promise in identifying subtle differences in gaze tracking and emotion recognition patterns in children with and without ASD, the present form factor and data do not allow for these differences to be reliably exploited by machine learning systems. Resolving these challenges will be an important step towards continuous tracking of the ASD phenotype.
Per the author's request the PDF is not available.
Copyright
© The authors. All rights reserved. This is a privileged document currently under peer-review/community review (or an accepted/rejected manuscript). Authors have provided JMIR Publications with an exclusive license to publish this preprint on its website for review and ahead-of-print citation purposes only. While the final peer-reviewed paper may be licensed under a CC-BY license on publication, at this stage the authors and publisher expressly prohibit redistribution of this draft paper other than for review purposes.