Accepted for/Published in: Journal of Medical Internet Research
Date Submitted: Dec 29, 2024
Open Peer Review Period: Dec 29, 2024 - Feb 23, 2025
Date Accepted: Apr 15, 2025
Warning: This is an author submission that is not peer-reviewed or edited. Preprints - unless they show as "accepted" - should not be relied on to guide clinical practice or health-related behavior and should not be reported in news media as established information.
Development and evaluation of predictive models using machine learning to identify fetal growth restriction in pre-eclampsia patients
ABSTRACT
Background:
Fetal growth restriction (FGR) is a common complication of preeclampsia (PE) and increases the risk of perinatal neonatal mortality and morbidity.
Objective:
The aim of this study was to develop a machine learning (ML)-based auxiliary diagnostic model to identify and predict the occurrence of FGR in patients with PE.
Methods:
This study used a retrospective case-control design to analyze the basic medical history and peripheral blood laboratory test results of pregnant patients with preeclampsia (PE), with or without fetal growth restriction (FGR). Machine learning models were constructed to evaluate the predictive value of changes in maternal parameters for PE complicated by FGR. The SHapley Additive exPlanations (SHAP) method was used to rank feature importance and explain the final model.
Results:
Of the 7 ML models, the random forest (RF) model showed the best discriminative ability. After reducing features according to the feature importance ranking, an explainable final RF model was established with 10 features. The final model accurately predicted FGR in both internal (AUC = 0.830) and external (AUC = 0.820) validation, and it has been translated into a convenient tool to facilitate its use in clinical settings.
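The workflow summarized above (compare candidate classifiers by AUC, then prune to the top-ranked features and refit a final random forest) can be sketched as follows. This is an illustrative sketch, not the authors' code: the data are synthetic stand-ins for the maternal history and laboratory features, and impurity-based feature importances are used in place of SHAP values to keep the example dependency-light.

```python
# Hedged sketch of the abstract's workflow: train candidate models, pick
# the best by AUC, keep the top 10 features, and refit a final RF model.
# Data and model settings are hypothetical; the study ranked features
# with SHAP, approximated here by impurity-based importances.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for maternal history + peripheral-blood lab features
X, y = make_classification(n_samples=600, n_features=25, n_informative=10,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, stratify=y, random_state=0)

# Compare candidate models by discriminative ability (AUC)
candidates = {
    "random_forest": RandomForestClassifier(n_estimators=300, random_state=0),
    "logistic_regression": LogisticRegression(max_iter=1000),
}
aucs = {}
for name, model in candidates.items():
    model.fit(X_train, y_train)
    aucs[name] = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])

# Keep the 10 most important features and refit, mirroring the
# abstract's reduced 10-feature final RF model
rf = candidates["random_forest"]
top10 = np.argsort(rf.feature_importances_)[::-1][:10]
final_rf = RandomForestClassifier(n_estimators=300, random_state=0)
final_rf.fit(X_train[:, top10], y_train)
final_auc = roc_auc_score(
    y_test, final_rf.predict_proba(X_test[:, top10])[:, 1])
print(sorted(aucs.items()), round(final_auc, 3))
```

In practice the SHAP ranking step would replace `feature_importances_` with per-feature mean absolute SHAP values computed on the fitted model, which also supports the per-patient explanations the study describes.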
Conclusions:
We developed an explainable ML model that accurately predicts the development of FGR in women with preeclampsia. The use of interpretable methods captures highly relevant risk factors for model interpretation, alleviating concerns about the "black box" problem of indirect interpretation of ML techniques.
Copyright
© The authors. All rights reserved. This is a privileged document currently under peer review/community review (or an accepted/rejected manuscript). The authors have provided JMIR Publications with an exclusive license to publish this preprint on its website for review and ahead-of-print citation purposes only. While the final peer-reviewed paper may be licensed under a CC-BY license on publication, at this stage the authors and publisher expressly prohibit redistribution of this draft paper other than for review purposes.