
Accepted for/Published in: Journal of Medical Internet Research

Date Submitted: Dec 19, 2022
Date Accepted: Apr 26, 2023

The final, peer-reviewed published version of this preprint can be found here:

Solans D Sr, Ramírez-Cifuentes D, Ríssola E, Freire A

Gender Bias When Using Artificial Intelligence to Assess Anorexia Nervosa on Social Media: Data-Driven Study

J Med Internet Res 2023;25:e45184

DOI: 10.2196/45184

PMID: 37289496

PMCID: 10288345

Gender Bias When Using Artificial Intelligence to Assess Anorexia Nervosa on Social Media

  • David Solans Sr; 
  • Diana Ramírez-Cifuentes; 
  • Esteban Ríssola; 
  • Ana Freire

ABSTRACT

Background:

Social media sites are becoming an increasingly important source of information about mental disorders. Among these, eating disorders are complex psychological problems that involve unhealthy eating habits. In particular, there is evidence that signs and symptoms of anorexia nervosa can be traced on social platforms. Because biases in input data tend to be amplified by artificial intelligence algorithms, and by machine learning in particular, these methods should be revised in order to mitigate biased discrimination in such important domains.

Objective:

The main goal of this work is to detect and analyze performance disparities across genders in algorithms trained to detect anorexia nervosa in social media posts. We use a collection of automated predictors trained on a dataset containing both users who show signs of anorexia and control cases.

Methods:

We first inspect differences in predictive performance between male and female users. Once biases are detected, we apply a feature-level bias characterization to evaluate their source, and we compare these features with those that clinicians consider relevant. Finally, we showcase different bias mitigation strategies for developing fairer automated classifiers, especially for risk assessment in sensitive domains.
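
One family of mitigation strategies operates as post-processing on classifier scores. As a minimal illustration of the idea (not necessarily the procedure used in the paper), the hypothetical helper below picks a separate decision threshold per gender group so that each group's false negative rate on held-out data stays at or below a chosen target:

```python
import numpy as np

def per_group_thresholds(scores, y_true, group, target_fnr=0.05):
    """Pick a decision threshold per group so that, on the data given,
    each group's false negative rate is at most target_fnr.

    Illustrative sketch: names and the target_fnr parameter are assumptions,
    not taken from the paper.
    """
    thresholds = {}
    for g in np.unique(group):
        # Sorted scores of the positive (at-risk) cases in this group
        pos = np.sort(scores[(group == g) & (y_true == 1)])
        if len(pos) == 0:
            thresholds[g] = 0.5  # fallback when no positives are observed
            continue
        # At most k positives may fall below the threshold (become false negatives)
        k = int(np.floor(target_fnr * len(pos)))
        thresholds[g] = pos[k]
    return thresholds
```

Predicting "at risk" whenever a user's score meets their group's threshold then equalizes (an upper bound on) the FNR across groups on the calibration data, at the cost of group-dependent decision rules.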

Results:

Our results reveal concerning differences in predictive performance, with a significantly higher false negative rate (FNR) for female samples (FNR = 0.082) than for male samples (FNR = 0.005). The findings show that biological processes and suicide risk factors are relevant for classifying positive male cases, whereas age, emotions, and personal concerns are more relevant for female cases. We also propose bias mitigation techniques and observe that, although the disparities can be mitigated, they cannot be eliminated.
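
The per-gender FNR comparison above can be reproduced for any classifier by computing FN / (FN + TP) separately over each group's positive ground-truth cases. A minimal sketch on hypothetical labels and predictions (the data below is invented for illustration, not from the study):

```python
import numpy as np

def false_negative_rate(y_true, y_pred):
    # FNR = FN / (FN + TP), computed over positive ground-truth cases only
    positives = y_true == 1
    fn = np.sum(positives & (y_pred == 0))
    tp = np.sum(positives & (y_pred == 1))
    return fn / (fn + tp)

# Hypothetical labels and predictions, split by gender group
y_true = np.array([1, 1, 1, 1, 0, 0, 1, 1, 0, 1])
y_pred = np.array([1, 0, 1, 1, 0, 0, 1, 1, 0, 1])
group  = np.array(["f", "f", "f", "f", "f", "m", "m", "m", "m", "m"])

for g in ["f", "m"]:
    mask = group == g
    print(g, false_negative_rate(y_true[mask], y_pred[mask]))
```

A gap between the two printed rates, as between 0.082 and 0.005 in the study, means the classifier misses at-risk users of one gender far more often than the other.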

Conclusions:

We conclude that more attention should be paid to assessing biases in automated methods for detecting mental health issues. This is particularly relevant before deploying systems intended to assist clinicians, especially considering that the outputs of such systems can have an impact on the diagnosis of people at risk.



© The authors. All rights reserved. This is a privileged document currently under peer review/community review (or an accepted/rejected manuscript). The authors have provided JMIR Publications with an exclusive license to publish this preprint on its website for review and ahead-of-print citation purposes only. While the final peer-reviewed paper may be licensed under a CC BY license on publication, at this stage the authors and publisher expressly prohibit redistribution of this draft paper other than for review purposes.