Competency and performance-based assessments
ePoster
4:00 pm
26 February 2024
Exhibition Hall (Poster 2)
Session Program
4:00 pm
Marika Wrzosek1
Kathleen Beckmann1, Leslie Ruffalo1, MaryAnn Gilligan1, Carley Sauter1, Cynthia Kay1, Ashley Pavlic1, Sarah Vepraskas1, Hari Paude1, and Heather Aschoff1
1 Medical College of Wisconsin
Background:
Assessing medical students’ competency is a challenging yet desirable goal in undergraduate medical training. Many institutions lack a consistent approach to share competency progression data with learners in a formative way. In response, our Continuous Professional Development (CPD) Course, which spans years three and four of medical school, utilizes spider graphs to track medical students’ professional development along eight institutional global competencies. The goal of this poster is to share our process for generating and sharing student competency data in this visual format.
Summary of work:
Students are assigned a CPD faculty director who monitors and tracks their competency progression. Each competency is rated on a 0-5 Likert scale (0 = no competency, 5 = full competency), with ratings pulled from evaluations by clinical preceptors. The CPD team works with a data analyst to synthesize competency data, now representing multiple data points over time.
Results:
This results in a visual “spider graph” depicting 1) an average of clinical preceptors’ objective ratings of the learner for each competency, 2) the class average on each competency, and 3) the student’s self-evaluated score on each competency.
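As an illustration only, a three-series spider (radar) graph of this kind can be drawn with matplotlib; the competency labels and scores below are invented placeholders, not the institution's eight global competencies or real student data.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical competencies and 0-5 ratings; the real eight institutional
# global competencies and scores are not given in the abstract.
competencies = ["Patient care", "Knowledge", "Communication", "Professionalism",
                "Practice-based learning", "Systems-based practice",
                "Interprofessional collaboration", "Personal development"]
preceptor_avg = [3.8, 3.5, 4.0, 4.2, 3.2, 3.0, 3.6, 3.4]
class_avg     = [3.6, 3.7, 3.9, 4.0, 3.4, 3.1, 3.5, 3.5]
self_eval     = [3.2, 3.0, 4.1, 4.3, 2.8, 2.9, 3.3, 3.1]

# One spoke per competency; repeat the first point to close each polygon.
angles = np.linspace(0, 2 * np.pi, len(competencies), endpoint=False).tolist()
angles += angles[:1]

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
for scores, label in [(preceptor_avg, "Preceptor average"),
                      (class_avg, "Class average"),
                      (self_eval, "Self-evaluation")]:
    values = scores + scores[:1]
    ax.plot(angles, values, label=label)
    ax.fill(angles, values, alpha=0.1)

ax.set_xticks(angles[:-1])
ax.set_xticklabels(competencies, fontsize=7)
ax.set_ylim(0, 5)  # Likert scale 0-5
ax.legend(loc="lower right", bbox_to_anchor=(1.2, 0))
plt.savefig("spider_graph.png", dpi=150, bbox_inches="tight")
```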
Discussion:
The resulting visual spider graph deftly depicts a learner’s progress, which is important for both learners and advisors. The collective view of subjective and objective performance over time gives an opportunity to strengthen deficiencies before progressing to residency application.
Conclusion:
Our standardized process to assess competency in undergraduate education has enhanced CPD advisors’ ability to normalize student experiences, identify areas of relative strength and weakness, compare individual students to peers, and assess trends in student self-assessment.
Take home points:
Competency-based education in medical education is on the rise and in demand. Our CPD course highlights a method by which students’ competency data informs formative feedback as learners progress across all clinical rotations.
4:05 pm
Hany Atwa1
Asmaa Abdelnasser2, Asser Sallam2 and Adel Abdelaziz2
1 College of Medicine and Medical Sciences, Arabian Gulf University
2 Faculty of Medicine, Suez Canal University
Introduction:
The OSCE is a well-known method of assessing clinical skills. This study aimed to explore the perceptions of fifth-year medical students of the newly implemented end-of-rotation Orthopedic Surgery and Trauma OSCE held at Suez Canal Medical School in Egypt.
Summary of Work:
This is a mixed-methods study that employed a convenience sample of 254 fifth-year medical students who underwent an OSCE at the end of their Orthopedic Surgery and Trauma rotation. Quantitative data were collected through a validated questionnaire comprising 32 items. Focus group discussions were conducted, and qualitative data were recorded, coded, and thematically analyzed.
Results:
Over half of the students (55.5%) believed that the exam was fair, and 72.4% felt it covered a wide range of clinical skills. Considerable percentages of students were doubtful regarding the standardization of OSCE scores (62.6%) and whether those scores provided a true measurement of their clinical skills (65%). More than half (55.5%) were not sure whether gender, personality, or ethnicity affected their exam scores. Qualitative analysis identified two themes: “Challenges of implementing OSCE” and “Ways to overcome identified challenges”.
Discussion:
This study found that the attributes, quality of performance, validity/reliability criteria and organization/settings of the newly implemented OSCE were positively perceived by the students. Several studies that explored students’ experiences with new OSCE encounters in different educational settings found similar results. This might be attributed to the structured and objective nature of OSCE.
Conclusion:
Medical students in our study positively perceived the organization and implementation of the Orthopedics OSCE, although some of them were doubtful regarding its validity and reliability in assessing their clinical skills in Orthopedics and Trauma.
Take-home Message:
The implementation of an OSCE in Orthopedic Surgery and Trauma can be effective in providing medical students with a valuable learning experience and in assessing their clinical skills.
References (maximum three)
- Rentschler DD, Eaton J, Cappiello J, McNally SF, McWilliam P. Evaluation of undergraduate students using objective structured clinical evaluation. J Nurs Educ. 2007;46:135–9.
- El-Nemer A, Kandeel N. Using OSCE as an assessment tool for clinical skills: Nursing students’ feedback. Aust J Basic Appl Sci. 2009;3:2465–72.
- Taylor CA, Green KE. OSCE feedback: A randomized trial of effectiveness, cost-effectiveness and student satisfaction. Creat Educ. 2013;4:9.
4:10 pm
Hany Atwa1
Adel Abdelaziz2 and Mohamed Hany Shehata1
1 Arabian Gulf University, Kingdom of Bahrain
2 Suez Canal University, Egypt
Introduction:
In ordinary circumstances, the objective structured clinical examination (OSCE) is a resource-intensive assessment method; developing and implementing a multidisciplinary OSCE undoubtedly costs more. This research project was conducted to develop, implement, and evaluate a multidisciplinary OSCE model within limited resources at the Faculty of Medicine, Suez Canal University (FOM-SCU), Egypt.
Summary of Work:
This research project went through the steps of a) blueprinting the clinical part of the program to guarantee content validity, b) station writing, c) resource reallocation, where the available resources within the FOM-SCU premises were redistributed and reused without the requisition of any further resources, and d) implementation and evaluation. Steps a and b were carried out by medical education experts with expertise in OSCE.
Results:
The developed model was implemented in the Primary Health Care (PHC) program, one of the pillars of the community-based undergraduate curriculum of the FOM-SCU. Data for evaluating the implemented OSCE model were derived from two sources. First, feedback was obtained from the students and assessors through self-administered questionnaires. Second, the psychometrics of the OSCE were evaluated. The deliverables of this research project included a set of validated, integrated, multi-disciplinary, low-cost OSCE stations with an estimated reliability index of 0.6.
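The abstract does not say how the reliability index was estimated; a common choice for OSCEs is Cronbach's alpha across stations. Below is a minimal Python sketch under that assumption, using invented station scores.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (examinees x stations) score matrix."""
    k = scores.shape[1]                          # number of stations
    item_vars = scores.var(axis=0, ddof=1)       # variance of each station
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of examinee totals
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical scores for 5 examinees across 4 stations (not study data).
demo = np.array([
    [3.0, 2.5, 4.0, 3.5],
    [2.0, 2.0, 3.0, 2.5],
    [4.0, 3.5, 4.5, 4.0],
    [1.5, 2.0, 2.5, 2.0],
    [3.5, 3.0, 4.0, 3.0],
])
print(f"alpha = {cronbach_alpha(demo):.2f}")
```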
Conclusion:
Through this experience, we now have a critical mass of faculty members trained in blueprinting and station writing, and a group of trained assessors, facilitators, and role players. Students are also aware of how to proceed in this type of OSCE, which renders future implementation more feasible.
Take-home Message:
Introducing an integrated multi-disciplinary OSCE is feasible under limited resources and would improve the acceptability, objectivity, validity, and reliability of the assessment system at medical schools.
References (maximum three)
- Harden RM, Gleeson FA. 1979. Assessment of clinical competence using an objective structured clinical examination. Med Educ 13:41–54.
- Newble D, Reed M. 2005. Developing and running an objective structured clinical examination (OSCE). Academic Unit of Medical Education. The University of Sheffield Publications 34:45–48.
- Austin Z, O’Byrne C, Pugsley J, Quero L. 2003. Development and validation processes for an objective structured clinical examination (OSCE) for entry-to-practice certification in pharmacy. The Canadian experience. Am J Pharm Educ 67(3): Article 76.
4:15 pm
Dayle Soong1
1 Adelaide Rural Clinical School, The University of Adelaide
Background
Workplace Based Assessments (WBAs) such as MiniCEX, Multisource Feedback (MSF) and Direct Observation of Procedures (DOPs) have been utilised by the Adelaide Rural Clinical School (ARCS) for many years, with paper forms being used to document feedback. Students can now collect formative feedback and complete summative WBAs online during their rural placement.
Summary
This project involved the development of a system to enable rural clinical supervisors and students to access online WBA forms at any time, in any place and on any device. A solution was designed using QR codes that link to the ‘ARCS eForm’ where supervisors can complete the relevant assessment and provide feedback in real-time. A pdf of the response data is automatically generated and copies are emailed to the student, the supervisor and the program coordinator.
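The abstract does not specify how the QR codes were produced. As a rough sketch, a static QR code linking supervisors to a web form can be generated with the Python qrcode package; the URL below is hypothetical, not the real ARCS eForm address.

```python
import qrcode  # pip install qrcode[pil]

# Hypothetical form URL; the real ARCS eForm address is not given in the abstract.
FORM_URL = "https://example.edu/arcs-eform?assessment=minicex"

# Build a QR code that clinical supervisors can scan to open the eForm.
img = qrcode.make(FORM_URL)
img.save("minicex_qr.png")  # printable image for posting in clinical areas
```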
Results
Ninety students have used the system, and we have received 471 WBA responses to date. Most students are using the eForm; however, we still receive some paper forms from students and supervisors.
Discussion
Implementing the system required careful design of the user experience for both students and supervisors. Consideration needed to be given to ensure the ARCS eForm was easy for rural supervisors, but also that academic integrity was upheld and feedback was provided in an appropriate format.
Conclusions
Transitioning to online WBAs has provided many advantages including improved accessibility, convenience, storage and retrieval, mobile compatibility, customisability, automation and better reporting capabilities.
Take-home messages/implications for further research or practice
Technology has enabled the successful transition from paper WBAs to online WBAs. This has also led to form response data now being in an electronic format that can be more easily analysed. Further research is needed to analyse the quality and impact of the WBA feedback being generated by the ARCS eForm.
References (maximum three)
Lin, B., Riegels, N., Ziv, T., Wamsley, M., & Sullivan, J. (2019, October 27-30). Use of a mobile tool linked to QR codes improves collection of formative student assessment in LICs [Conference Presentation]. The Consortium of Longitudinal Integrated Clerkships (CLIC) Conference 2019, Vancouver, Canada.
4:20 pm
Chang How Choo1
Wanzhen Zhao1, Jasmine Pei Ling Boo2, Cheng Jee Goh2 and Yee Teng Ong2
1 Nursing Education & Development, Tan Tock Seng Hospital, Singapore
2 Speech Therapy, Tan Tock Seng Hospital, Singapore
Background
Healthcare Assistants (HCAs) are a group of support care staff who attend to patients’ activities of daily living (SkillsFuture, 2023). Many are employed and placed on a workplace apprenticeship model, where trainees are assessed using traditional checklists. As there is a growing need for feeding supervision for patients with dysphagia (Poon, Ward & Burns, 2023), this project aimed to upskill the competence of HCAs in performing supervised feeding through a blended assessment design referencing Miller’s pyramid of assessment.
Summary of Work
Miller’s pyramid consists of four hierarchical levels representing the progressive stages of a healthcare provider’s clinical competence development (Ramani & Leinster, 2008). The team designed a four-tiered assessment strategy, which included a knowledge-based multiple-choice e-quiz, simulated scenario applications, standardised patient exercises and direct observation in clinical areas. Over a period of four months in 2022, nurse educators and speech therapists administered the classroom-based simulated assessments, while nursing clinical champions assessed the HCAs’ observable behaviours in the clinical settings. A subsequent online survey was conducted three months later through convenience sampling to evaluate the efficacy of the implemented strategies.
Results
The survey gathered responses from 55 HCAs (37%), 55 clinical champions (46%) and 61 nurses across different inpatient areas. The results revealed that all nurses and clinical champions agreed that the tiered assessment methodology effectively empowered and prepared HCAs for the competency. Additionally, 54 of the 55 HCAs (98%) expressed confidence in performing the supervised feeding skill.
Conclusion
The incorporation of Miller’s pyramid in the tiered assessment design is beneficial for workplace-based assessment, addressing both cognitive and behavioural aspects of clinical competence. This approach instils confidence in skill performance of support care staff. Future research could explore the integration of technology to enhance accessibility in training and assessment, considering the resource and manpower constraints associated with traditional face-to-face classroom simulated assessment.
References (maximum three)
Poon, M. M., Ward, E. C., & Burns, C. L. (2023). Adult dysphagia services in acute and subacute settings in Singapore. Speech, Language and Hearing. DOI: 10.1080/2050571X.2023.2240988
Ramani, S. & Leinster, S. (2008). AMEE Guide No. 34: Teaching in the clinical environment. Medical Teacher, 30, 347-364. DOI: 10.1080/01421590802061613
SkillsFuture Singapore (2023). Learn about job roles – healthcare assistant / basic care assistant / nursing aide. https://www.myskillsfuture.gov.sg/content/student/en/preu/world-of-work/occupation.html
4:25 pm
Sarintip Thongsiw1
Piyaporn Sirijanchune1
1 Medical Education Center, Chiangrai Prachanukroh Hospital
Previously, the Objective Structured Clinical Examination (OSCE) was the standard traditional assessment of medical students’ clinical performance, although it was limited in some aspects of reliability and validity. As an alternative, the Assessment of Special Clinical Encounter (ASCE) replaced the OSCE for evaluating clinical skills. Both examinations are considered objective assessments of clinical practical skills. The OSCE is a common method of assessing clinical competence in a simulated clinical environment and consists of multiple stations in which specific clinical skills are performed within a limited period. The ASCE gathers multiple clinical skills, with a mix of clinical problems assessed in each station, and requires more time. This study aimed to assess the ASCE compared with the standard traditional OSCE. A cross-sectional study was conducted from June 2021 to June 2023, using structured questionnaires with brief interviews; 103 medical students and 22 examiners participated. In the survey, 83% of the medical students were satisfied with the examination structure and organization, and 93% with its ability to evaluate clinical competence; among examiners, the corresponding figures were 72% and 84%. However, 91% of the medical students reported the ASCE as stressful, while 74% noted its flexibility, and 80% of examiners remarked on the long duration of the ASCE. Overall, both medical students and examiners perceived the ASCE positively in various aspects. The ASCE shows benefits for assessing the clinical performance of medical students, and this study indicates that it is a practical tool for assessing medical students’ clinical competence. It could replace traditional discipline-based clinical examinations. Further development of the ASCE for evaluation is warranted.
References (maximum three)
1. Bevan J, Russell B, Marshall B. A new approach to OSCE preparation - PrOSCEs. BMC Med Educ. 2019;19(1):126. https://doi.org/10.1186/s12909-019-1571-5.
2. Grover, S., Pandya, M., Ranasinghe, C. et al. Assessing the utility of virtual OSCE sessions as an educational tool: a national pilot study. BMC Med Educ 22, 178 (2022). https://doi.org/10.1186/s12909-022-03248-3
4:30 pm
Andrew Ming-Liang Ong1,2
Clasandra Hum1, Ching Ming Wan1, Jodie Lee1, Wai Yee Phoon1 and Hak Khoon Tan1
1 Graduate Medical Education Office, Singhealth, Singapore
2 Department of Gastroenterology & Hepatology, Singapore General Hospital
Abstract
Singaporean postgraduate programs are transitioning to entrustable professional activities (EPAs). An online assessment platform, MedHub, was introduced to accommodate this. We describe our organisational faculty development (FD) strategy and its successful implementation.
Summary of work:
A modified Delphi consensus was conducted with 17 individuals (7 program directors (PDs), 6 program coordinators and 4 vice-chairs). Sixteen principles were chosen to reflect EPA implementation, divided into 5 phases for longitudinal and systematic FD.
Phase | Period | Aim | Attendance | Activities |
1 | Sep 2022 | Introduce Programmatic Assessment (PA) & EPAs | 100 | (1) Didactic session; (2) how-to videos & resources created in learning management system (LMS); (3) EPA implementation guide disseminated |
2 (The Right Tool) | Nov 2022 | Designing assessment strategy & tools with MedHub | 60 | (1)+(2)+(3) Workshop designing tools |
3 (The Right Interpretation) | Jan 2023 | Structuring Clinical Competency Committee (CCC) meetings & reports using MedHub | 87 | (1)+(2)+(3) Workshop running CCC |
4 (The Right Assessors) | Mar 2023 | Train assessors | 121 | (1)+(2)+(3) Designing guidebook to train faculty |
5 (The Right Feedback) | May 2023 | Train learners | 127 | (1)+(2)+(3) Designing guidebook to train learners |
Results/Discussion
Eleven PDs completed surveys pre-implementation and at 1 year. Each principle was marked “not planned/implemented” (0), “planned” (1) or “implemented” (2). Moving from 0 to 1 or 2, or from 1 to 2, was considered improvement.
Phase | Principle | % of programs with improvement |
1 | Purpose of EPAs within PA made clear | 90.9% |
2 | EPA assessment tool captures entrustment-based decision | 100.0% |
2 | Workflow to trigger EPA assessment | 81.8% |
2 | MedHub collects EPAs AND competencies data | 90.9% |
3 | CCC workflow evaluating EPA AND Competencies using MedHub | 54.5% |
3 | CCC pre-defined triangulation rules of EPA and existing tools | 54.5% |
3 | CCC reports include milestones and EPA entrustment | 54.5% |
4 | Shared mental model (SMM) of each EPA disseminated | 81.8% |
4 | SMM for entrustment concept disseminated | 81.8% |
5 | Workflow sharing assessment data with residents | 54.5% |
5 | Residents trained to improve using EPA data | 45.5% |
5 | Faculty trained to deliver EPA feedback | 81.8% |
5 | Learning activities pegged to EPAs | 81.8% |
5 | EPA Resources available | 72.7% |
5 | EPA Orientation material available | 90.9% |
5 | Structured EPA FD program | 81.8% |
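As a minimal illustration of how the improvement rates in the table above could be computed from paired pre/post ratings, here is a short Python sketch; the ratings below are invented, not the survey data.

```python
# Hypothetical pre/post ratings for one principle across 11 programs:
# 0 = not planned/implemented, 1 = planned, 2 = implemented.
pre  = [0, 0, 1, 1, 0, 2, 1, 0, 1, 0, 1]
post = [1, 2, 2, 2, 2, 2, 2, 0, 2, 1, 2]

# A program improves if its rating moves up (0 -> 1/2 or 1 -> 2).
improved = sum(1 for before, after in zip(pre, post) if after > before)
print(f"{improved / len(pre):.1%} of programs improved")  # 81.8% for this data
```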
Conclusions:
We describe a novel 1-year FD program using 16 core EPA implementation principles spanning 5 phases, supported by successful program improvement rates. We will adopt this for future FD within our institution.
References (maximum three)
Carraccio C, Martini A, Van Melle E, Schumacher DJ. Identifying Core Components of EPA Implementation: A Path to Knowing if a Complex Intervention Is Being Implemented as Intended. Acad Med. 2021 Sep 1;96(9):1332-1336. doi: 10.1097/ACM.0000000000004075. PMID: 33769339.
Hennus MP, Jarrett JB, Taylor DR, Ten Cate O. Twelve tips to develop entrustable professional activities. Med Teach. 2023 Jul;45(7):701-707. doi: 10.1080/0142159X.2023.2197137. Epub 2023 Apr 7. PMID: 37027517
Favreau MA, Tewksbury L, Lupi C, Cutrer WB, Jokela JA, Yarris LM; AAMC Core Entrustable Professional Activities for Entering Residency Faculty Development Concept Group. Constructing a Shared Mental Model for Faculty Development for the Core Entrustable Professional Activities for Entering Residency. Acad Med. 2017 Jun;92(6):759-764. doi: 10.1097/ACM.0000000000001511. PMID: 28557935.
4:35 pm
Sreedhar Radhika1
Linda Chang1, Sarah Donohue1, Ananya Gangopadhyaya1, Peggy Woziwodzki Shiels1, Asra Khan1, Laura McKenzie1, and Yoon Soo Park1
1 University of Illinois College of Medicine
Background
There are few Entrustable Professional Activity (EPA)-based assessments that evaluate the bedside practice of evidence-based medicine (EBM), and fewer “systems” of assessments incorporating EBM knowledge and performance. We created an assessment system to evaluate medical student readiness to practice key components of EPA 7, “Form Clinical Questions and Retrieve Evidence to Advance Patient Care.”
Summary of Work
The assessment system, created by consensus, consisted of a multiple-choice quiz and a performance-based simulation, administered to medical students transitioning to clerkships. The 26-item quiz measured individual knowledge and skills in the Ask, Acquire, Appraise and Advise aspects of EPA 7. Triads of students participated in the simulation, which measured team-based ability to appraise information and advise a standardized patient (SP) about a therapy.
Results
298/306 (97%) students participated in the quiz and simulation. 75% of students passed the quiz (mean score 71.5 ± 10.73%), with 62% and 66% of students correctly answering the questions in the Appraise and Advise categories, respectively.
Correlations between student performance on individual quiz items in the Appraise and Advise categories and the team-based performance assessment did not show significant associations (Appraise: r = 0.05, P = .546; Advise: r = 0.09, P = .234). Learner and SP satisfaction with the ability to convey and understand the information, respectively, showed a strong association (r = 0.70, P < .001).
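For illustration, a Pearson correlation of this kind can be computed with scipy.stats.pearsonr; the paired scores below are randomly generated stand-ins, not the study's data.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)

# Hypothetical paired scores: each team's quiz performance on the
# Appraise items vs. its rated performance in the SP simulation.
quiz_appraise = rng.normal(70, 10, size=100)
sim_rating = rng.normal(3.5, 0.6, size=100)  # independent draws, so r should be near 0

r, p = pearsonr(quiz_appraise, sim_rating)
print(f"r = {r:.2f}, P = {p:.3f}")
```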
Discussion
This assessment system demonstrates that student knowledge of how to appraise information and apply it to patients, as measured in the quiz, did not translate to the intended effect of communicating that information to the SP.
Conclusion
Simulations help identify gaps in student teams’ ability to apply EBM to patient care.
Take-home messages / implications for further research or practice
Assessing medical student readiness for appraising information and communicating it to patients using simulation should be considered in the assessment of student competence.
References (maximum three)
1. Albarqouni L, Hoffmann T, Straus S, Olsen NR, Young T, Ilic D, Shaneyfelt T, Haynes RB, Guyatt G, Glasziou P. Core Competencies in Evidence-Based Practice for Health Professionals: Consensus Statement Based on a Systematic Review and Delphi Survey. JAMA Network Open. 2018 June 1;1(2):e180281.
2. Kyriakoulis K, Patelarou A, Laliotis A, Wan AC, Matalliotakis M, Tsiou C, Patelarou E. Educational strategies for teaching evidence-based practice to undergraduate health students: systematic review. J Educ Eval Health Prof. 2016 Sep 22;13:34. doi: 10.3352/jeehp.2016.13.34. PMID: 27649902.
3. Kumaravel B, Stewart C, Ilic D. Development and evaluation of a spiral model of assessing EBM competency using OSCEs in undergraduate medical education. BMC Med Educ. 2021 Apr 10;21(1):204. doi: 10.1186/s12909-021-02650-7. PMID: 33838686.
4:40 pm
Young-Min Kim1,2
Sun Mi Yoo2, Hye Rim Jin2, Eun Ju Kim2, Ji Eun Kim2 and Chang-Jin Choi3,2
1 Department of Emergency Medicine, The Catholic University of Korea College of Medicine
2 START Center for Medical Simulation, The Catholic University of Korea College of Medicine
3 Department of Family Medicine, The Catholic University of Korea College of Medicine
Background:
Progress testing of knowledge has been widely implemented in medical schools. However, the progress OSCE for clinical skills (CS) is relatively new and requires more study.
Summary of Work:
We implemented the progress clinical performance examination (PCPX) as part of the summative performance assessment in a longitudinal CS program. Two CPX scenarios (A: clinical reasoning-focused with physical examination; B: counseling-focused without physical examination) were repeatedly incorporated into the diagnostic OSCE at the end of the 2nd year (PCPX I), the 10-station OSCE in the 3rd year (PCPX II), and the 10-station OSCE in the 4th year (PCPX III). We compared scores between the PCPXs and performed correlation analyses between PCPX III and national CS examination (KMLE-CS) scores.
Results:
Data from 96 students were analyzed, and 69 students provided their KMLE-CS scores. The CPX A scores were higher in PCPX III than in PCPX I, and the correct items in the physical examination and communication domains were higher in PCPX III than in PCPX II. The CPX B scores improved in PCPX II compared to PCPX I but declined in PCPX III (p<0.001 and p=0.001, respectively), with communication skills mainly declining. The PCPX III score was moderately correlated with the total KMLE-CS score (r=0.356, p=0.003) and the SP-based CPX KMLE-CS score (r=0.315, p=0.008).
Discussion:
Measurement error is possible due to the case specificity of the scenarios. Repeated use of the same scenarios could pose internal validity threats due to history, maturation, and testing bias.
Conclusions:
The PCPX was feasible, and students’ CS progress could be identified. Efforts to improve communication skills in counseling settings during the final-year clerkship are necessary.
Take-home messages:
As part of the summative performance assessment, the PCPX could help identify the students’ CS progress. Diversifying cases, developing additional scenarios, and adjusting the number of OSCE stations are required.
References (maximum three)
1. Gold JG, DeMuth RH, Mavis BE, Wagner DP. Progress Testing 2.0: clinical skills meets necessary science. Med Educ Online 2015;20:27769.
2. DeMuth RH, Gold JG, Mavis BE, Wagner DP. Progress on a New Kind of Progress Test: Assessing Medical Students' Clinical Skills. Acad Med 2018;93:724-8.
3. Pugh D, Touchie C, Wood TJ, Humphrey-Murto S. Progress Testing: Is there a role for the OSCE? Med Educ 2014;48:623-31.
4:45 pm
Jeanette Ignacio1
Tanushri Roy1
1 National University of Singapore
Background
In traditional nursing curricula, the physical examination course stands alone and is not linked to the anatomy and physiology course. However, as students transition from the university environment to the clinical setting during postings, there is an expectation that they have integrated the various knowledge and skills from their different courses. This does not easily happen, as cognitive integration takes time (Ignacio and Chen, 2020). It is thus imperative that curriculum design helps facilitate cognitive integration in students. As such, an integrated anatomy, physiology, and physical examination course was conceived. This course integrated the three disciplines not only in how they were taught, but also in the design of the course’s assessment/examination.
Objectives
The aim of this study was to determine the effects of integrating anatomy, physiology, and physical examination in teaching and assessments.
Methods
The integrated assessment/examination scores were compared with results from the previous semester, when there was no integration. Students’ perceptions of how the integrated course’s teaching and assessments facilitated learning were also elicited through focus groups.
Findings
Performance based on assessment results was compared between two cohorts of students: those who attended anatomy and physiology, and physical examination as two separate courses, and those who attended these subjects as an integrated course. The comparison shows a significant difference (p < .00001) in the average scores between the two groups. This aligns with previous findings that show an improvement in student success following content integration in health science courses (Finn et al., 2017). Thematic analysis of the focus group discussions generated three themes: (1) forming better conceptual links; (2) learning through the senses; and (3) translating into clinical practice.
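The abstract does not name the statistical test behind the p-value; a between-cohort comparison of mean scores is often done with an independent-samples t-test. Below is a minimal sketch with invented scores, not the study's data.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(1)

# Hypothetical exam scores (out of 100) for the two cohorts.
separate_courses = rng.normal(68, 8, size=120)   # separate A&P and physical exam courses
integrated_course = rng.normal(74, 8, size=120)  # integrated course

t, p = ttest_ind(integrated_course, separate_courses)
print(f"t = {t:.2f}, p = {p:.5f}")
```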
Conclusion
Integrating disciplines through teaching practices and assessment/examination design is a promising strategy to promote cognitive integration.
References (maximum three)
Finn, K. E., FitzPatrick, K., & Yan, Z. (2017). Integrating Lecture and Laboratory in Health Sciences Courses Improves Student Satisfaction and Performance. Journal of College Science Teaching, 47(1), 66-75.
Ignacio, J. & Chen, H.-C. (2020). Cognitive integration in health professions education: Development and implementation of a collaborative learning workshop in an undergraduate nursing program. Nurse Education Today, 90. https://doi.org/10.1016/j.nedt.2020.104436