Accepted for/Published in: JMIR mHealth and uHealth
Date Submitted: Jan 14, 2020
Date Accepted: Aug 3, 2020
Using Machine Learning and Smartphone and Smartwatch Data to Detect Emotional States and Transitions: An Exploratory Study
ABSTRACT
Background:
Emotional state in everyday life is an essential indicator of health and well-being. However, daily assessment of emotional states largely depends on active self-reports, which are often inconvenient and prone to incomplete information. Automated detection of emotional states and transitions based on everyday context could be an effective solution to this problem. Yet, the relationship between emotional transitions and everyday contexts remains unexplored.
Objective:
The objective of this study is twofold: 1) to explore the relationship between contextual information and emotional states and transitions, and 2) to evaluate the feasibility of detecting emotional transitions and states from daily contextual information using machine learning techniques.
Methods:
This study was conducted on data from 18 participants in a publicly available dataset called ExtraSensory. Contextual and sensor data were collected using smartphone and smartwatch sensors under free-living conditions, with the number of days per participant ranging from three to nine. Sensors included an accelerometer, gyroscope, compass, location services, microphone, phone state indicator, light, temperature, and barometer. Participants self-reported approximately 49 discrete emotions at varying intervals via a smartphone application throughout the data collection period. We mapped the 49 reported discrete emotions to the three dimensions of the PAD (Pleasure, Arousal, Dominance) model and considered six emotional states: discordant, pleased, dissuaded, aroused, submissive, and dominant. We built both general and personalized models for detecting emotional transitions and states every five minutes. Transition detection was a binary classification problem (whether a person's emotional state changed over time), whereas state detection was a multi-class classification problem. In both cases, we leveraged a wide range of supervised machine learning algorithms, together with data preprocessing, feature selection, and data imbalance handling techniques. Lastly, we conducted an assessment to shed light on the association between everyday context and emotional states.
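The detection setup described above — discrete emotions collapsed into a small set of states, binary "did the state change?" labels per 5-minute window, and an imbalance-aware supervised classifier with feature selection — can be sketched as follows. This is a minimal illustration, not the authors' actual pipeline: the emotion-to-state mapping, feature dimensions, and classifier choice below are hypothetical stand-ins, and the sensor features are synthetic.

```python
# Hedged sketch of the transition-detection setup; NOT the paper's
# actual pipeline. Mapping, features, and model choice are assumed.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical assignment of a few discrete emotions to the six
# PAD-derived states named in the abstract (illustrative only).
EMOTION_TO_STATE = {
    "happy": "pleased",
    "sad": "discordant",
    "excited": "aroused",
    "calm": "dissuaded",
    "shy": "submissive",
    "confident": "dominant",
}

def transition_labels(states):
    """Binary labels per 5-minute window: 1 if the emotional state
    differs from the previous window, else 0. The first window has
    no predecessor and is labelled 0 by convention."""
    return [0] + [int(a != b) for a, b in zip(states, states[1:])]

# Synthetic stand-in for per-window smartphone/smartwatch features.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 20))
states = rng.choice(list(EMOTION_TO_STATE.values()), size=200)
y = transition_labels(states)

# Preprocessing, feature selection, and imbalance handling (here via
# class weighting), mirroring the steps listed in the Methods.
clf = Pipeline([
    ("scale", StandardScaler()),
    ("select", SelectKBest(f_classif, k=10)),
    ("model", RandomForestClassifier(class_weight="balanced",
                                     random_state=0)),
])
clf.fit(X, y)
```

A personalized model would fit one such pipeline per participant, whereas a general model pools windows across participants; state detection replaces the binary labels with the six-way state labels.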
Results:
This study obtained promising results for emotional state and transition detection. The best AUROC for emotional state detection reached 60.55% in the general models and an average of 96.33% across the personalized models. Despite the highly imbalanced data, the best AUROC for emotional transition detection reached 90.5% in the general models and an average of 88.73% across the personalized models. Feature analysis showed that spatio-temporal context, phone state, and motion-related information were the most informative for detecting emotional states and transitions. Our assessment also showed that lifestyle affects the predictability of emotion.
Conclusions:
Our results demonstrate a strong association between daily context and emotional states and transitions, as well as the feasibility of detecting them using smartphone and smartwatch sensor data. Clinical Trial: N/A
Copyright
© The authors. All rights reserved. This is a privileged document currently under peer-review/community review (or an accepted/rejected manuscript). Authors have provided JMIR Publications with an exclusive license to publish this preprint on its website for review and ahead-of-print citation purposes only. While the final peer-reviewed paper may be licensed under a CC-BY license on publication, at this stage authors and publisher expressly prohibit redistribution of this draft paper other than for review purposes.