Times are shown in GMT.
Assessment of Clinical Decision Making
Oral Presentation
1:30 pm
26 February 2024
M210
Session Program
1:30 pm
Gerard Corrigan1
Suzanne Estaphan2
1 School of Rural Medicine, Charles Sturt University
2 School of Medicine and Psychology, Australian National University
Background:
To improve students’ clinical reasoning skills, we need a process that makes their reasoning visible (Audetat, Laurin et al. 2017). Process mapping makes visible how students approach learning (Smith and Corrigan 2018).
Summary of work:
Process mapping was used to determine how students engaged with clinical reasoning over a problem-based learning (PBL) case. Using a proforma, students noted conscious decisions they made and were interviewed individually about these decisions. These data provided the material to construct process maps. These maps were categorised, coded and analysed for any decision-making related to clinical reasoning.
Results:
Of the 135 process maps, 71% showed students engaging with clinical reasoning. Students engaged with clinical reasoning in a number of different ways and reflected metacognitively on their approaches to it.
Discussion:
Students make decisions about how and why they engage with clinical reasoning. Process mapping captures this decision-making and reveals why students make the decisions they do. This degree of detail might just be the type of feedback required to support students to develop their clinical reasoning capabilities. Process mapping can contribute to a systematic approach to teaching clinical reasoning as urged by Cooper, Bartlett et al. (2021).
Conclusions:
Process mapping captures, in detail, how students approach clinical reasoning and reveals their reasoning when they do so. Process mapping is a suitable microanalytic assessment method for examining students’ use and understanding of clinical reasoning in the context of PBL.
Implications for practice and further research:
The acquisition of clinical reasoning may be facilitated using feedback provided by process mapping. If we can assess how students approach clinical reasoning outside of the formal clinical curriculum, we should. Process mapping has the potential to be developed into an authentic assessment tool for assessing clinical reasoning in PBL.
References (maximum three)
Audetat, M. C., S. Laurin, V. Dory, B. Charlin and M. R. Nendaz (2017). "Diagnosis and management of clinical reasoning difficulties: Part I. Clinical reasoning supervision and educational diagnosis." Med Teach: 1-5.
Cooper, N., M. Bartlett, S. Gay, A. Hammond, M. Lillicrap, J. Matthan, M. Singh and U.K.C.R.i.M.E.c.s. group (2021). "Consensus statement on the content of clinical reasoning curricula in undergraduate medical education." Med Teach 43(2): 152-159.
Smith, P. and G. Corrigan (2018). "How learners learn: A new microanalytic assessment method to map decision-making." Med Teach: 1-9.
1:45 pm
Larissa Ruczynski1
Bas Schouwenberg2, Eugène Custers3, Cornelia Fluit1 and Marjolein van de Pol4
1 Research on Learning and Education, Radboudumc Health Academy, Radboud University Medical Center, Nijmegen, the Netherlands
2 Department of Pharmacology and Toxicology and Department of Internal Medicine, Radboud University Medical Center, Nijmegen, the Netherlands
3 Department of Online Learning and Instruction, Faculty of Educational Sciences, Open Universiteit, Heerlen, the Netherlands
4 Department of Primary and Community Care, Radboud University Medical Center, Nijmegen, the Netherlands
BACKGROUND:
Recently, a new digital clinical reasoning test (DCRT) was developed to evaluate students’ clinical-reasoning skills, using six different question types[1]. Although an assessment tool may be soundly constructed, it may still prove inadequate in practice by failing to function as intended. Therefore, more insight is needed into the effects of the DCRT in practice.
SUMMARY OF WORK:
Individual semi-structured interviews and template analysis were used to collect and process qualitative data. The template, based on the interview guide, contained six themes: (1) DCRT itself, (2) test debriefing, (3) reflection, (4) practice/workplace, (5) DCRT versus practice and (6) ‘other’.
RESULTS/DISCUSSION:
Thirteen students were interviewed. The DCRT encourages students to engage more in formal education, self-study and workplace learning during their clerkships, particularly those who received an insufficient result. Although the faculty emphasizes the different purposes of the DCRT (assessment of/as/for learning)[2], most students perceive the DCRT as an assessment of learning. This affects their motivation and the role they assign to it in their learning process. Although students appreciate the debriefing and reflection report for improvement, they struggle to fill the identified knowledge gaps because of the timing of receiving their results. Some students are supported by the DCRT in exhibiting lifelong learning behavior.
CONCLUSION/IMPLICATIONS:
This study has identified several ways in which the DCRT influences students’ learning practices in a way that can benefit their clinical-reasoning skills. It also highlights the importance of aligning theoretical principles with practice in both the development and implementation of assessment tools as well as the content of such tools. Further research is needed to investigate the long-term impact of the DCRT on young physicians’ working practice and to evaluate the DCRT through the collection of more validation arguments, for example by employing Kane’s validity perspective[3].
References (maximum three)
1. Daniel M, Rencic J, Durning SJ, Holmboe E, Santen SA, Lang V, et al. Clinical Reasoning Assessment Methods: A Scoping Review and Practical Guidance. Academic Medicine. 2019;94(6):902-12.
2. Winter J. The changing prepositions of assessment practice: assessment of, for and as learning. Br Educ Res J. 2003;29(5):767-72.
3. Kane M. Validating score interpretations and uses. Language Testing. 2011;29(1):3-17.
2:00 pm
Martin Pusic1,2
Yoon Soo Park3
1 American Board of Medical Specialties
2 Harvard Medical School
3 University of Illinois Chicago School of Medicine
Background:
In assessment, it can be difficult to disentangle uncertainty due to incomplete mastery from structural uncertainty, where even a fully trained expert would be uncertain. Here we use item-response models to quantify diagnostic uncertainty at the clinician and case levels.
Summary of Work:
Forty dermatologists rated 100 images of skin lesions that might be diagnosed as malignant melanoma, using a dichotomous categorization of either “no further treatment” (NFT) or “biopsy/further treatment” (Bx). We modeled the resulting fully crossed data (40 raters × 100 images of skin lesions) using three approaches: a Rasch Model, a Signal Detection Model, and a Graded Response Model.
Results:
Under the Rasch model, the 100 cases demonstrated a full range of diagnostic uncertainty, from -4.18 logits (all dermatologists predicted to rate “Bx”) to +4.20 logits (all dermatologists predicted to rate “NFT”). Fourteen of the cases fell within 0.5 logits of the 0 mid-point, where a dermatologist of average bias would be predicted to be equally likely to endorse either category, suggesting a significant proportion of cases with uncertainty for all practitioners. Several of the ultimately benign cases showed ratings consistent with malignancy. Modelling practitioners, we found considerable practice variation in where they set their biopsy cut points (see Figure). Signal Detection and Graded Response Model results were complementary to those of the Rasch Model.
Discussion:
Item response modeling, when aligned with a clinical decision such as whether to biopsy a case of potential melanoma, can be used to provide feedback as to a clinician’s tendencies and how they would be predicted to respond to individual cases.
Conclusions:
The presented work is an advance in that it allows case by case interpretation of an individual’s decision threshold, all taken in the context of demonstrated practice variation.
References (maximum three)
Baldwin P, Bernstein J, Wainer H. Hip psychometrics. Stat Med. 2009;28(17):2277-2292. doi:10.1002/sim.3616
Pusic MV, Rapkiewicz A, Raykov T, Melamed J. "Estimating the Irreducible Uncertainty in Visual Diagnosis: Statistical Modeling of Skill Using Response Models". Accepted for publication in Medical Decision Making Feb 17, 2023.
Pusic M, Cook D, Friedman J, et al. Modeling diagnostic expertise in cases of irreducible uncertainty: the decision aligned response model. Acad Med. 2022;98(1):88–97.
2:15 pm
Amanda Edgar1
Suzanne Estaphan2, Luke Chong3, James A Armitage3, Lucy Ainge3 and Gerard Corrigan4
1 Deakin Learning Futures, Deakin University
2 School of Medicine and Psychology, The Australian National University
3 School of Medicine, Deakin University
4 School of Rural Medicine, Charles Sturt University
Background:
Many assessment methods focus on the high-level elements of clinical reasoning, such as information gathering, generating differential diagnoses and developing targeted management plans (Daniel, Rencic et al. 2019). In their exploration of the diagnosis and management of clinical reasoning difficulties, Audetat and colleagues argue there should be a focus on the reasoning process itself (Audetat, Laurin et al. 2017). This study was designed to understand less accessible elements of clinical reasoning during assessment, including the reasoning process.
Summary of work:
Three students completed a digital simulated authentic clinical reasoning assessment task. Process mapping (Smith and Corrigan, 2018), a form of microanalysis, was used to investigate the clinical reasoning process, using decisions students made whilst completing the assessment task. The process map data were analysed and coded.
Results:
Seventeen process maps were generated for participant 1 and 18 each for participants 2 and 3, totalling 53 process maps. A codebook was generated, comprising four codes and 27 sub-codes.
Discussion:
This study demonstrated that process mapping can be used to investigate the clinical reasoning processes optometry students use in a digitally simulated authentic clinical case. The codebook generated in this study can be used in future iterations of microanalytic research incorporating process mapping and clinical reasoning in the context of optometry.
Conclusions:
Students are utilising clinical reasoning to make conscious decisions, and process mapping has the potential to investigate the underlying process further, including identifying the heuristics and biases used to make decisions in the simulated clinical case.
Implications for practice and further research:
Future research could focus on uncovering how process maps can be used as a reflective tool to provide feedback and feed forward into future assessments and clinical experiences.
References (maximum three)
Audetat, M. C., S. Laurin, V. Dory, B. Charlin and M. R. Nendaz (2017). "Diagnosis and management of clinical reasoning difficulties: Part I. Clinical reasoning supervision and educational diagnosis." Med Teach: 1-5.
Daniel, M., J. Rencic, S. J. Durning, E. Holmboe, S. A. Santen, V. Lang, T. Ratcliffe, D. Gordon, B. Heist, S. Lubarsky, C. A. Estrada, T. Ballard, A. R. Artino, Jr., A. Sergio Da Silva, T. Cleary, J. Stojan and L. D. Gruppen (2019). "Clinical Reasoning Assessment Methods: A Scoping Review and Practical Guidance." Acad Med 94(6): 902-912.
Smith, P. and G. Corrigan (2018). "How learners learn: A new microanalytic assessment method to map decision-making." Med Teach: 1-9.
2:30 pm
Johanna Klutmann1
Constanze Dietzsch1, Ute Schlasius-Ratter2, Alexander Oksche2, Sara Volz-Willems1, Johannes Jäger1 and Fabian Dupont1
1 Department of Family Medicine, Saarland University
2 German Institute for State Examinations in Medicine, Pharmacy, Dentistry and Psychotherapy (IMPP), Mainz, Germany
Background:
Clinical reasoning is a vital skill in medical education, encompassing various cognitive processes such as observation, eliciting historical information, physical maneuvers, hypothesis generation, and diagnostic test ordering (1). Several researchers have tried to make clinical reasoning visible as a cognitive process during assessment (2, 3). This study aims to visualize, categorize and compare clinical reasoning strategies in tablet-based multiple-choice question (MCQ) assessments and to explore the thought processes of academic high achievers and their correct or incorrect reasoning.
Summary of Work:
During the winter semester 2022/23 and the summer semester 2023, two cohorts of 100 fifth-year medical students each participated in the Year 5 Family Medicine curriculum at Saarland University, Germany, culminating in an exam based on state-exam questions. The exam comprised 60 MCQs, including two two-step key-feature questions. Self-assessment questions on clinical reasoning were included after each MCQ. Literature-based deductive content analysis was conducted, involving researcher triangulation to ensure consistency. The same process was repeated for the subsequent cohort in the summer semester 2023.
Results:
This study introduces a novel approach to measuring clinical reasoning in tablet-based MCQ assessment, shedding light on students’ thought processes during exams. It helps identify reasons for errors and whether students apply clinical reasoning during exams. Furthermore, it may reveal whether high-achieving students demonstrate more forward reasoning, considered applicable in real-life medical scenarios. (to be completed)
Discussion:
The study underscores the importance of thought processes within clinical reasoning in a competency-based curriculum. It might help align teaching with assessment strategies to promote desirable clinical reasoning skills. It also highlights weaknesses in exam question quality and reveals clinical reasoning processes beyond summative results.
Conclusion:
Understanding clinical reasoning in medical education MCQ assessment helps to continuously improve MCQs and potentially reach higher competency levels through better question and assessment formats. Incorporating such visualization exercises of clinical reasoning may lead to improved exam question quality and foster a clearer focus on forward clinical reasoning as a desirable thought process among students.
References (maximum three)
- Eva KW (2005) What every teacher needs to know about clinical reasoning. Med Educ 39:98–106
- Beullens J, Struyf E, Van Damme B (2005) Do extended matching multiple-choice questions measure clinical reasoning? Med Educ 39:410–417
- Hrynchak P, Takahashi SG, Nayer M (2014) Key-feature questions for assessment of clinical reasoning: a literature review. Med Educ 48:870–883
2:45 pm
Ngoc-Thanh-Van Nguyen1
Sy Van Hoang1, Duc Trong Quach1, Khanh Duc Nguyen1, Thi-My-Hanh Nguyen1, Lam Ho Nguyen1, Kha Minh Nguyen1, Thi-Bich-Thuy Van1, Duy Cong Tran1, Tran-Tuyet-Trinh Nguyen1, Tuan Thanh Tran1 and Hoa Ngoc Chau1
1 University of Medicine and Pharmacy at Ho Chi Minh city
Critical reasoning is a crucial competence of medical graduates. Current OSCE grading in Vietnam’s medical schools is heavily checklist-based, emphasizing the number of items performed rather than a comprehensive and logical approach to arriving at the most probable diagnoses. Furthermore, a conventional fixed cutscore (4 out of 10) was applied regardless of cohorts and testing materials. We conducted the first Vietnamese study to improve standard setting in a summative OSCE cardiovascular station, focusing on critical reasoning.
We had two types of summative OSCE. One focused on history-taking; the other required students to come up with reasonable diagnoses and order appropriate investigations using the information provided. We revised the OSCE questions in both types. For the former, at the end of history-taking, learners had to conclude the most probable diagnoses with reasons. For the latter, more complex scenarios were given to evaluate higher-order thinking. To execute the change, we trained subject matter experts, formed a Clinical Competence Committee, and revised all OSCE questions, checklists and global ratings. The borderline regression group method (BRGM) was used to establish the new passing score. The Content Validity Index was 100% for each item and for the entire checklists.
In the academic year 2022-2023, 296 third-year students were evaluated. In both semesters, the BRGM-derived passing score was 5 out of 10, higher than the fixed cutscore. Consequently, passing rates were lower with BRGM than with the conventional method (56.5% vs 82.8% in semester 1 and 89.2% vs 100% in semester 2). The difference between cutscores was attributed to the disparity of item weights between checklists and to the definition of the borderline group. Items relating to critical reasoning were given inadequate weight in the checklists; therefore, even if learners missed these critical items, they could still score more than 4 and pass the exam.
High-stakes examinations should be standardized regularly to accurately assess critical reasoning and distinguish between competent and incompetent candidates.
References (maximum three)
1. Javaeed A. Assessment of Higher Ordered Thinking in Medical Education: Multiple Choice Questions and Modified Essay Questions [version 1]. MedEdPublish 2018, 7:128 (https://doi.org/10.15694/mep.2018.0000128.1)
2. Lai, JH., Cheng, KH., Wu, YJ. et al. Assessing clinical reasoning ability in fourth-year medical students via an integrative group history-taking with an individual reasoning activity. BMC Med Educ 22, 573 (2022). https://doi.org/10.1186/s12909-022-03649-4
3:00 pm
Douglas Wong1
Breanna Wright1
1 Victoria University
Background
Educators encounter common challenges when designing meaningful assessments of clinical reasoning. Assessing competence in students with limited real-life exposure to patients from diverse backgrounds is a familiar issue, and the recent rise of artificial intelligence presents an additional complication.
The Clinical Reasoning Task (CRT) is an oral assessment strategy designed to foster clinical reasoning and build patient-centered communication skills across a range of clinical domains. It is currently being implemented within a suite of six post-graduate units in the Victoria University osteopathy program, which students complete via intensive delivery. This assessment engages students with scenarios they are not guaranteed to encounter whilst on clinical placement, including complex pain presentations and the management of diverse patient populations such as First Nations Peoples.
Results
Evaluation of the CRT is ongoing. Initial qualitative feedback suggests learners appreciate being exposed to patient interactions that might not be encountered until after graduation. Informal feedback from staff members has also indicated heightened student in-class engagement and attendance.
Discussion
The CRT offers a viable solution in resource-constrained settings and an alternative to the challenges posed by long-form case studies, which can be problematic to implement within an intensive delivery mode. Our approach engages students in clinical reasoning at regular checkpoints and is complementary to clinical placements.
Successful completion of the CRT requires the integration of theoretical knowledge with critical thinking, decision-making, and problem-solving prowess within authentic patient scenarios. The CRT confers an added benefit through its capacity to uphold academic integrity, owing to the task's in-class modality and adherence to closed-book parameters.
Conclusion
The CRT embodies a novel approach for evaluating clinical reasoning and patient-centered communication proficiencies, with applicability extending to various health disciplines and clinical contexts. Prospective expansion of the CRT includes group-based learning exercises in the setting of interprofessional education.
References (maximum three)
Pearce, J. & Chiavaroli, N. (2023). Rethinking assessment in response to generative artificial intelligence. Medical Education, 2023, 1-3. doi: 10.1111/medu.15092. Online ahead of print.
Young, M. E., Thomas, A., Lubarsky, S., Gordon, D., Gruppen, L. D., Rencic, J., ... & Durning, S. J. (2020). Mapping clinical reasoning literature across the health professions: a scoping review. BMC Medical Education, 20, 1-11.
3:15 pm
Barbara Masi1
Kellie Mullany2, Jaya Yodh2, Imanni Sheppard2, Grace Park2 and Samar Hegazy2
1 Virginia Maryland College of Veterinary Medicine
2 Carle Illinois College of Medicine, University of Illinois, Urbana-Champaign
BACKGROUND
Problem-based learning (PBL) enhances clinical reasoning by providing structure for knowledge acquisition/integration and clinical reasoning skill development. Concept mapping (CM) is an evidence-based tool for learning and assessment, particularly in PBL. There is interest in integrating the social determinants of health (SDOH) framework into medical education to enhance patient-centered rather than disease-focused healthcare. However, there are no clear standards for integrating SDOH or assessing its impact on clinical reasoning and problem-solving.
SUMMARY OF WORK
This work investigated the impact of integrating the SDOH framework into PBL on students' clinical reasoning and problem-solving skills, towards enhancing patient-centered healthcare. A CM assignment, a scoring rubric, and oral reflection on CM development were used to assess students' problem-solving strategies. Fourteen student volunteers were randomly assigned to control (CG) and experimental (EG) groups. The CG explored two cases with clinical/basic science probes. The EG explored the same cases with additional probes linking the SDOH framework and completed group concept maps (CMs) to integrate concepts. To assess the intervention's impact, both groups were given new case vignettes to create individual treatment-plan CMs supported by recorded narratives. Narrative transcripts were analyzed using inductive coding.
RESULTS
Inductive coding of 28 total narratives identified four themes that align with clinical problem-solving and the SDOH framework: (1) Framing the Case, (2) Clinical Reasoning to Diagnosis, (3) Clinical Reasoning to Treatment Plan, and (4) SDOH-Framework-Informed. Qualitative analysis revealed group differences in clinical problem-solving approaches. The EG followed an SDOH-informed clinical reasoning approach to patient-centered treatment plans with balanced therapeutic/non-therapeutic elements. The CG followed a clinical reasoning approach to a plan focusing mainly on the clinical aspects of the case.
DISCUSSION
SDOH integration into PBL using CM with narratives helped students visualize and reflect on integrated concepts, thus fostering patient-focused clinical reasoning skills.
CONCLUSIONS
This curricular approach enhances students' ability to provide patient-centered care by applying SDOH-informed clinical reasoning.
TAKE-HOME MESSAGE
This novel approach provides an effective mechanism for SDOH integration in medical education toward developing patient-centered physicians.
References (maximum three)
Hung CH, Lin CY. Using concept mapping to evaluate knowledge structure in problem-based learning. BMC Med Educ. 2015 Nov 27;15:212. doi: 10.1186/s12909-015-0496-x. PMID: 26614519; PMCID: PMC4662011.
Doobay-Persaud A, Adler MD, Bartell TR, Sheneman NE, Martinez MD, Mangold KA, Smith P, Sheehan KM. Teaching the Social Determinants of Health in Undergraduate Medical Education: a Scoping Review. J Gen Intern Med. 2019 May;34(5):720-730. doi: 10.1007/s11606-019-04876-0. PMID: 30993619; PMCID: PMC6502919.
Gruppen LD. Clinical Reasoning: Defining It, Teaching It, Assessing It, Studying It. 2018.
Daniel M, Rencic J, Durning SJ, Holmboe E, Santen SA, Lang V, Ratcliffe T, Gordon D, Heist B, Lubarsky S, Estrada CA, Ballard T, Artino AR Jr, Sergio Da Silva A, Cleary T, Stojan J, Gruppen LD. Clinical Reasoning Assessment Methods: A Scoping Review and Practical Guidance. Acad Med. 2019 Jun;94(6):902-912. doi: 10.1097/ACM.0000000000002618. PMID: 30720527.