ePoster
Presentation Description
Radhika Sreedhar1, Linda Chang1, Sarah Donohue1, Ananya Gangopadhyaya1, Peggy Woziwodzki Shiels1, Asra Khan1, Laura McKenzie1, and Yoon Soo Park1
1 University of Illinois College of Medicine
Background
Few Entrustable Professional Activity (EPA)-based assessments evaluate the bedside practice of evidence-based medicine (EBM), and fewer “systems” of assessments incorporate both EBM knowledge and performance. We created an assessment system to evaluate medical students’ readiness to practice key components of EPA 7, “Form Clinical Questions and Retrieve Evidence to Advance Patient Care.”
Summary of Work
The assessment system, created by consensus, consisted of a multiple-choice quiz and a performance-based simulation, administered to medical students transitioning to clerkships. The 26-item quiz measured individual knowledge and skills in the Ask, Acquire, Appraise, and Advise aspects of EPA 7. Triads of students participated in the simulation, which measured team-based ability to appraise information and advise a standardized patient (SP) about a therapy.
Results
298/306 (97%) students completed the quiz and simulation. 75% of students passed the quiz (mean score 71.5% ± 10.73%), with 62% and 66% of students correctly answering the questions in the Appraise and Advise categories, respectively.
Correlations between student performance on individual quiz items in the Appraise and Advise categories and team-based performance in the simulation were not significant (Appraise: r = 0.05, P = .546; Advise: r = 0.09, P = .234). Learner and SP satisfaction with the ability to convey and to understand the information, respectively, showed a strong association (r = 0.70, P < .001).
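For readers wishing to reproduce this type of analysis, a minimal sketch in Python is shown below. The score arrays are hypothetical placeholders, not study data, and scipy.stats.pearsonr is one standard way to obtain the r and P values reported above.

    # Minimal sketch of the item-performance correlation analysis.
    # All score values below are hypothetical placeholders, not study data.
    from scipy.stats import pearsonr

    # Per-student scores on the Appraise quiz items (proportion correct)
    appraise_quiz = [0.60, 0.80, 0.40, 1.00, 0.60, 0.80]
    # Matching team-based simulation performance ratings for the same students
    simulation = [3.2, 4.1, 2.8, 3.9, 3.5, 3.7]

    r, p = pearsonr(appraise_quiz, simulation)
    print(f"Appraise vs. simulation: r = {r:.2f}, P = {p:.3f}")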
Discussion
This assessment system demonstrates that student knowledge of how to appraise information and apply it to patients, as measured by the quiz, did not translate into the intended effect of communicating that information to the SP.
Conclusion
Simulations help identify gaps in student teams’ ability to apply EBM to patient care.
Take-home messages / implications for further research or practice
Simulation-based assessment of medical students’ readiness to appraise information and communicate it to patients should be considered when evaluating student competence.
References (maximum three)
1. Albarqouni L, Hoffmann T, Straus S, Olsen NR, Young T, Ilic D, Shaneyfelt T, Haynes RB, Guyatt G, Glasziou P. Core Competencies in Evidence-Based Practice for Health Professionals: Consensus Statement Based on a Systematic Review and Delphi Survey. JAMA Netw Open. 2018 Jun 1;1(2):e180281.
2. Kyriakoulis K, Patelarou A, Laliotis A, Wan AC, Matalliotakis M, Tsiou C, Patelarou E. Educational strategies for teaching evidence-based practice to undergraduate health students: systematic review. J Educ Eval Health Prof. 2016 Sep 22;13:34. doi: 10.3352/jeehp.2016.13.34. PMID: 27649902.
3. Kumaravel B, Stewart C, Ilic D. Development and evaluation of a spiral model of assessing EBM competency using OSCEs in undergraduate medical education. BMC Med Educ. 2021 Apr 10;21(1):204. doi: 10.1186/s12909-021-02650-7. PMID: 33838686.