Presentation Description
Johanna Klutmann1
Constanze Dietzsch1, Ute Schlasius-Ratter2, Alexander Oksche2, Sara Volz-Willems1, Johannes Jäger1 and Fabian Dupont1
1 Department of Family Medicine, Saarland University, Germany
2 German Institute for State Examinations in Medicine, Pharmacy, Dentistry and Psychotherapy (IMPP), Mainz, Germany
Background:
Clinical reasoning is a vital skill in medical education, encompassing cognitive processes such as observation, elicitation of historical information, physical manoeuvres, hypothesis generation, and diagnostic test ordering (1). Several researchers have attempted to make clinical reasoning visible as a cognitive process during assessment (2, 3). This study aims to visualize, categorize, and compare clinical reasoning strategies in tablet-based multiple-choice question (MCQ) assessments and to explore the thought processes of academic high achievers and their correct or incorrect reasoning.
Summary of Work:
During the winter semester 2022/23 and the summer semester 2023, two cohorts of 100 fifth-year medical students each participated in the Year 5 Family Medicine curriculum at Saarland University, Germany, culminating in an examination based on state-exam-style questions. The exam comprised 60 MCQs, including two two-step key-feature questions. Self-assessment questions on clinical reasoning were included after each MCQ. A literature-based deductive content analysis was conducted, involving researcher triangulation to ensure consistency. The same process was repeated for the subsequent cohort in the summer semester 2023.
Results:
This study introduces a novel approach to measuring clinical reasoning in tablet-based MCQ assessment, shedding light on students' thought processes during exams. It helps identify reasons for errors and whether students apply clinical reasoning during exams. Furthermore, it may reveal whether high-achieving students demonstrate more forward reasoning, which is considered applicable in real-life medical scenarios. (to be completed)
Discussion:
The study underscores the importance of thought processes within clinical reasoning in a competency-based curriculum. It might help align teaching with assessment strategies to promote desirable clinical reasoning skills. It also highlights weaknesses in exam question quality and reveals clinical reasoning processes beyond summative results.
Conclusion:
Understanding clinical reasoning in MCQ assessment in medical education helps to continuously improve MCQs and potentially reach higher competency levels through better question and assessment formats. Incorporating such visualization exercises of clinical reasoning may improve exam question quality and foster a clearer focus on forward clinical reasoning as a desirable thought process among students.
References (maximum three)
- Eva KW (2005) What every teacher needs to know about clinical reasoning. Med Educ 39:98–106
- Beullens J, Struyf E, Van Damme B (2005) Do extended matching multiple-choice questions measure clinical reasoning? Med Educ 39:410–417
- Hrynchak P, Takahashi SG, Nayer M (2014) Key-feature questions for assessment of clinical reasoning: a literature review. Med Educ 48:870–883