Accepted for/Published in: Journal of Medical Internet Research
Date Submitted: Mar 24, 2024
Open Peer Review Period: Apr 1, 2024 - May 27, 2024
Date Accepted: Nov 25, 2024
Warning: This is an author submission that is not peer-reviewed or edited. Preprints - unless they show as "accepted" - should not be relied on to guide clinical practice or health-related behavior and should not be reported in news media as established information.
Identification of Intracranial Germ Cell Tumors Based on Facial Photo: Software Development Using Deep Learning Technology
ABSTRACT
Background:
Primary intracranial germ cell tumors (iGCTs) are highly malignant brain tumors that predominantly occur in children and adolescents, with an incidence ranking third among primary brain tumors in East Asia (8%-15%). Because of their insidious onset and impact on critical functional areas of the brain, these tumors often result in irreversible abnormalities in growth and development, as well as cognitive and motor impairments in affected children. Early diagnosis is therefore vital for improving patient outcomes and quality of life.
Objective:
This study aimed to develop and evaluate GVisageNet, a deep learning model for the early detection of intracranial germ cell tumors (iGCTs) in children and adolescents based on facial photographs.
Methods:
Facial photograph datasets were collected and divided into training (n=574; iGCTs=358, control=189) and validation (n=142; iGCTs=79, control=63) sets, with an additional independent test set (n=236; iGCTs=130, control=106) sourced from four medical institutions. GVisageNet, a deep learning model, was trained on the facial photographs. A regression model built from clinically relevant, statistically significant indicators was developed and combined with GVisageNet outputs to create a hybrid model, allowing assessment of the incremental value of clinical data. The model's predictive mechanisms were explored through correlation analyses with endocrine indicators and stratified evaluations based on the degree of hypothalamic-pituitary-target (HPT) axis damage. Performance metrics included area under the curve (AUC), accuracy, sensitivity, and specificity.
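The hybrid design described above can be sketched in code. The snippet below is a minimal, hypothetical illustration (not the authors' implementation): a logistic regression is fit on simulated clinical indicators, and a second "hybrid" regression additionally receives the image model's output probability as one more feature. All variable names and the simulated data are assumptions for illustration only.

```python
# Hypothetical sketch of a hybrid model: clinical-only logistic
# regression vs. clinical features + image-model probability.
# Data are simulated; feature names and scales are illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200

# Simulated clinical indicators (e.g., endocrine labs) and labels
# (1 = iGCT, 0 = control), with label signal in the first column.
clinical = rng.normal(size=(n, 5))
labels = (clinical[:, 0] + 0.5 * rng.normal(size=n) > 0).astype(int)

# Simulated per-patient probability from the facial-photo model.
cnn_prob = 1 / (1 + np.exp(-(2 * labels - 1 + rng.normal(size=n))))

# Clinical-only regression model.
clin_model = LogisticRegression().fit(clinical, labels)

# Hybrid model: clinical features plus the image-model probability.
hybrid_X = np.column_stack([clinical, cnn_prob])
hybrid_model = LogisticRegression().fit(hybrid_X, labels)

print("clinical-only accuracy:", clin_model.score(clinical, labels))
print("hybrid accuracy:", hybrid_model.score(hybrid_X, labels))
```

In this late-fusion design, the image model's score enters the regression as a single covariate, so its incremental value over the clinical indicators can be read off directly from the comparison of the two fitted models.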
Results:
On the independent test set, GVisageNet achieved an AUC of 0.739, outperforming the regression model alone (AUC=0.632; P<.001) but underperforming the hybrid model (AUC=0.789; P=.04). The model's outputs correlated significantly with nine endocrine indicators, and performance varied with the degree of HPT axis damage, offering insight into GVisageNet's working mechanism.
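For readers unfamiliar with the reported metrics, the sketch below shows how AUC, accuracy, sensitivity, and specificity are computed from a model's scores. The data are simulated and the 0.5 decision threshold is an assumption; this is not the study's evaluation code.

```python
# Illustrative computation of AUC, accuracy, sensitivity, and
# specificity from simulated model scores and binary labels.
import numpy as np
from sklearn.metrics import roc_auc_score, confusion_matrix

rng = np.random.default_rng(1)
y_true = rng.integers(0, 2, size=300)          # 1 = iGCT, 0 = control
y_score = y_true * 0.6 + rng.random(300) * 0.8  # mildly informative scores
y_pred = (y_score >= 0.5).astype(int)           # assumed threshold

auc = roc_auc_score(y_true, y_score)            # threshold-free ranking metric
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
accuracy = (tp + tn) / (tp + tn + fp + fn)
sensitivity = tp / (tp + fn)                    # true positive rate
specificity = tn / (tn + fp)                    # true negative rate

print(round(auc, 3), round(accuracy, 3),
      round(sensitivity, 3), round(specificity, 3))
```

Note that AUC is computed from the continuous scores (it summarizes ranking across all thresholds), whereas accuracy, sensitivity, and specificity depend on the chosen classification threshold.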
Conclusions:
GVisageNet offers a promising approach for the early detection of iGCTs, emphasizing the importance of combining deep learning with clinical data for personalized healthcare.
Copyright
© The authors. All rights reserved. This is a privileged document currently under peer-review/community review (or an accepted/rejected manuscript). Authors have provided JMIR Publications with an exclusive license to publish this preprint on its website for review and ahead-of-print citation purposes only. While the final peer-reviewed paper may be licensed under a CC BY license on publication, at this stage the authors and publisher expressly prohibit redistribution of this draft paper other than for review purposes.