Accepted for/Published in: JMIR Medical Education
Date Submitted: Nov 19, 2024
Date Accepted: Oct 7, 2025
Warning: This is an author submission that is not peer-reviewed or edited. Preprints - unless they show as "accepted" - should not be relied on to guide clinical practice or health-related behavior and should not be reported in news media as established information.
An adaptation of eCREST, an online patient simulation tool to support clinical reasoning: A mixed-methods evaluation comparing clinical reasoning styles of physician associate and medical students
ABSTRACT
Background:
Clinical reasoning is increasingly recognised as an important skill in the diagnosis of serious conditions. eCREST (electronic Clinical Reasoning Educational Simulation Tool), an online learning resource, was developed to support medical students in learning clinical reasoning. However, primary care teams now encompass a wider range of professional groups, such as Physician Associates (PAs), who also need to develop clinical reasoning during their training.
Objective:
We sought to evaluate the suitability of eCREST for PA students by comparing the reasoning styles of PA and medical students using the tool.
Methods:
Between 2017 and 2021, PA students and medical students used eCREST to learn clinical reasoning skills in either an experimental or a learning context. Students undertook a simulated case of a patient presenting with chest pain. During the case they could ask questions, order bedside tests, and select physical examinations, helping them form, reflect on, and reconsider diagnostic ideas and management strategies. An exploratory analysis compared data gathering, flexibility in diagnosis, and diagnostic ideas between medical and PA students.
Results:
In total, 159 medical students and 54 PA students completed the case. PA students were older (M=27±7 vs. M=24±4 years, p<.001) and more likely to be female (80% vs. 53%, p<.001). Medical and PA students were similar in the proportion of essential questions asked (M=70.13±22.24 vs. M=73.24±17.40, p=.326), physical examinations requested (M=54.7±18.6 vs. M=54.0±21.1, p=.586), bedside tests selected (M=74.4±29.1 vs. M=83.3±28.8, p=.053), and number of times they changed their diagnoses (M=2.8±1.4 vs. M=2.8±1.5, p=.993). PA students initially included a greater proportion of relevant diagnoses (M=51.5±14.2 vs. M=43.4±14.4, p<.001), but both groups improved during the case, particularly medical students (p=.046), and finished with similar proportions of relevant diagnoses (M=65.7±14.6 vs. M=62.9±15.5, p=.226).
Conclusions:
These results provide suggestive evidence that eCREST can serve the needs of both medical and PA students in developing clinical reasoning skills to support diagnosis in primary care.
Copyright
© The authors. All rights reserved. This is a privileged document currently under peer review/community review (or an accepted/rejected manuscript). The authors have provided JMIR Publications with an exclusive license to publish this preprint on its website for review and ahead-of-print citation purposes only. While the final peer-reviewed paper may be licensed under a CC BY license on publication, at this stage the authors and publisher expressly prohibit redistribution of this draft paper other than for review purposes.