Ottawa 2024

Virtual, dispersed and disrupted OSCEs

Oral Presentation

2:00 pm, 27 February 2024, M203


Salma Satchu1
Karen Fung1, Mahmoud Suleiman1, Yuen Chu1 and John Pugsley1
1 Pharmacy Examining Board of Canada



The Pharmacy Examining Board of Canada (PEBC) typically assesses about 3,000 pharmacy practitioner candidates each year over four separate OSCE sittings and successfully maintained its in-person OSCEs throughout the pandemic. Recognizing the ongoing instability in OSCE delivery and the residual impacts of the pandemic, PEBC sought to evaluate a virtual format of the OSCE for assessing the entry-to-practice competence of pharmacist candidates, and the feasibility of this testing format within its certification program. Prior research on virtual OSCEs has been mostly limited to the academic setting, and only a few studies have been conducted in the context of high-stakes credentialing examinations. 

The not-for-credit Virtual Performance Examination pilot assessed 92 volunteer pharmacist candidates in a fully virtual, remote-format exam using an external exam platform selected through a formal request-for-proposal process. Forty assessors and 45 Simulated Participants (SPs) were recruited for a single-day delivery in June 2023. All candidates were first-time test takers in the actual for-credit OSCE delivered one month prior to the pilot. The pilot exam consisted of 9 interactive and 7 rest stations, with 4 tracks running simultaneously and 2 sessions administered throughout the day. A Steering Committee and an Implementation Committee were formed to plan, guide, and execute the pilot exam. The committees worked closely with the platform vendor on all preparations and the actual delivery of the exam. 

This presentation will discuss the considerations and adaptations made to various aspects of the OSCE for the virtual transition, including changes to the administration and scoring processes, station content, SP and assessor training, and participant and station materials. Data analyses are currently in progress and are expected to be completed by Fall 2023. Results on the comparability of participant experience and station performance between the virtual and in-person formats will be shared. 



References (maximum three) 

1. Tavares, W., Dichter, R., Leung, Y. C., & Huiskamp, M. (2020). A pandemic means rethinking performance-based assessments. Medical Education, 1-2. 

Clare Heal1
Jane Smith2, Leanne Hall1, Karen D'Souza3 and Karina Jones
1 JCU
2 ACCLAIM, Bond University
3 School of Medicine Deakin University




Background:
Objective Structured Clinical Examinations (OSCEs) are used to assess clinical skills(1). We investigated how exit OSCEs changed in Australian medical schools in response to the COVID-19 pandemic. 


Summary of Work:
The 12 eligible Australian medical school members of the Australian Collaboration for Clinical Assessment in Medicine (ACCLAiM)(2) received a 45-item semi-structured online questionnaire. 


Results: 
All schools (12/12) responded. Exit OSCEs were not used by one school in 2019, and by 3/11 schools in 2020. Of the eight remaining schools, four reduced station numbers and testing time(3). The minimum OSCE testing time decreased from 80 min in 2019 to 54 min in 2020. Other modifications included a completely online ‘e-OSCE’ (n=1), hybrid delivery (n=4), and stations using videos of patient encounters (n=4), telephone calls (n=2), or skill completion without face-to-face patient encounters (n=2). The proportion of stations involving physical examination fell from 33% to 17%. Fewer examiners were required, and university faculty staff formed a higher proportion of examiners. 


Discussion:
All schools changed their OSCEs in 2020 in response to COVID-19(3). Modifications ranged from reducing station numbers and changing delivery methods to removing the OSCE and completely restructuring assessment. Several innovative methods of OSCE delivery were implemented to preserve OSCE validity and reliability whilst balancing feasibility. 


Conclusions:
Opportunities and challenges resulted in innovative modifications to OSCE delivery and a streamlining of examiner and simulated patient resources. These findings may be generalisable to other medical and health professional training institutions responsible for delivering OSCEs, both within Australia and internationally. 


Implications:
Most schools that implemented changes to OSCEs as a result of the pandemic (64%) reported a desire to retain some modifications for future assessments. Further research is needed to explore the reasoning behind retention of COVID-19 modifications in a post-COVID environment, as well as the consequences of these changes for all stakeholders. 



References (maximum three) 

(1) Khan KZ, Ramachandran S, Gaunt K, Pushkar P. 2013. The Objective Structured Clinical Examination (OSCE): AMEE Guide No. 81. Part I: an historical and theoretical perspective. Med Teach. 35(9):e1437-1446. 

(2) Malau-Aduli BS, Teague PA, Turner R, Holman B, D'Souza K, Garne D, Heal C, Heggarty P, Hudson JN, Wilson IG et al. 2016. Improving assessment practice through cross-institutional collaboration: An exercise on the use of OSCEs. Med Teach. 38(3):263-271. 

(3) Heal C, D'Souza K, Banks J, Malau-Aduli BS, Turner R, Smith J, Bray E, Shires L, Wilson I. 2019. A snapshot of current Objective Structured Clinical Examination (OSCE) practice at Australian medical schools. Med Teach. 41(4):441-447. 

Nadja Mattmann1
Jill Voegelin1 and Slavko Rogan1
1 Bern University of Applied Sciences, Department of Health Professions, Division of Physiotherapy 



Background
The Objective Structured Clinical Examination (OSCE) is an assessment tool used in health education and medicine to evaluate clinical skills and competency. During the COVID-19 lockdown, health professional education schools not only offered distance learning but also considered alternative methods of OSCE delivery. One approach that was used was the virtual OSCE. 


Summary of work
Perhaps most importantly, the COVID-19 pandemic poses questions about the value of a university education. To stay relevant, universities need to reinvent their learning environments so that digitization becomes part of them. The purpose of this study was therefore to summarize current evidence on the satisfaction of students, examiners, and standardized patients, on students’ performance, and on the feasibility (time and logistics) of a virtual OSCE. 


Methods
The SPIDER tool (S: Sample, P: Phenomenon of interest, D: Design, E: Evaluation, R: Research type) was used to formulate the research question. Two investigators independently and systematically searched the PubMed database. Titles, abstracts, and full texts were screened against the eligibility criteria. Studies with qualitative, quantitative, and mixed-methods designs were included. The main outcomes were ‘satisfaction’, ‘performance’, and ‘feasibility’ of a virtual OSCE. 


Results
The review included 14 articles. The key issues identified for ‘satisfaction’ were (1) preparation, (2) adequate content, (3) preparation for real life, and (4) comparison to the standard OSCE; for ‘feasibility’ they were (1) planning time, (2) technology, and (3) costs. ‘Performance’ was measured using data from the regular OSCE of the previous year. 


Conclusion
This study concludes that positive feedback from both faculty and participants highlights the potential of this method to enhance distance learning and assessment in the field. Overall, all participants were satisfied with the effort, organization, and delivery of a virtual OSCE. There were no concerns about the use of innovative technology in the assessment of a virtual OSCE. 



References (maximum three) 

Harden, R. M., Stevenson, M., Downie, W. W., & Wilson, G. M. (1975). Assessment of clinical competence using objective structured examination. BMJ, 1(5955), 447–451. https://doi.org/10.1136/bmj.1.5955.447 

Kharaba, Z., AlAhmad, M. M., Ahmed Elnour, A., Abou Hajal, A., Abumweis, S., & Ghattas, M. A. (2023). Are we ready yet for digital transformation? Virtual versus on-campus OSCE as assessment tools in pharmacy education. A randomized controlled head-to-head comparative assessment. Saudi Pharmaceutical Journal: SPJ: The Official Publication of the Saudi Pharmaceutical Society, 31(3), 359–369. https://doi.org/10.1016/j.jsps.2023.01.004 

Lebdai, S., Bouvard, B., Martin, L., Annweiler, C., Lerolle, N., & Rineau, E. (2023). Objective structured clinical examination versus traditional written examinations: A prospective observational study. BMC Medical Education, 23, 69. https://doi.org/10.1186/s12909-023-04050-5 

Peter Yeates1
Adriano Maluf2, Natalie Cope1, Gareth McCray1, Kathy Cullen3, Vikki O'Neill3, Rhian Goodfellow4, Rebecca Vallander4, Ching-wa Chung5 and Richard Fuller6
1 Keele University
2 De Montfort University
3 Queen's University Belfast
4 Cardiff University
5 University of Aberdeen
6 Christie Hospitals NHS Foundation Trust




Introduction
Ensuring the inter-institutional equivalence of graduation-level OSCE decisions is critical to fairness and patient safety; however, methodological challenges mean this is rarely studied. Recently, an innovation called video-based examiner score comparison and adjustment (VESCA)(1) has enabled linked comparison of examiners within distributed OSCEs. Since prior research has hinted at potentially substantial inter-institutional differences(2), we used VESCA to determine the equivalence of different parallel groups (“examiner-cohorts”) within and between UK medical schools, and the impact of adjusting for any differences on students’ pass rates. 


Methods
We ran the same 6-station formative OSCE at four UK medical schools(3). After examining live performances, examiners additionally scored three station-specific comparison videos, which provided (1) a controlled comparison of examiners’ scoring between schools and (2) data linkage within a linear mixed model. The impact of adjusting for examiner variation on students’ pass/fail outcomes and ranks was calculated. 
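
For illustration, the following is a minimal, hypothetical Python sketch of the core adjustment idea, not the authors’ actual VESCA analysis (which links examiner-cohorts through scores on shared comparison videos): fit a linear mixed model with a random intercept per examiner-cohort, then subtract each cohort’s estimated stringency from observed scores before applying the pass mark. All data, column names, and the cut score below are illustrative assumptions.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: one row per examiner-scored station performance.
df = pd.DataFrame({
    "score":           [17.0, 18.5, 16.0, 20.0, 19.5, 21.0, 15.5, 18.0],
    "school":          ["A",  "A",  "A",  "B",  "B",  "B",  "A",  "B"],
    "examiner_cohort": ["A1", "A1", "A2", "B1", "B1", "B2", "A2", "B2"],
})

# A random intercept per examiner-cohort captures cohort-level stringency or
# leniency; the fixed effect of school then estimates between-school differences
# in student performance after accounting for examiner variation.
model = smf.mixedlm("score ~ school", data=df, groups=df["examiner_cohort"])
result = model.fit()

# Adjusted scores: remove each cohort's estimated stringency from observed scores.
cohort_effect = {g: eff.iloc[0] for g, eff in result.random_effects.items()}
df["adjusted_score"] = df["score"] - df["examiner_cohort"].map(cohort_effect)

PASS_MARK = 17.0  # hypothetical cut score
df["pass_observed"] = df["score"] >= PASS_MARK
df["pass_adjusted"] = df["adjusted_score"] >= PASS_MARK

With so few rows the fit is purely illustrative; in the study it is the shared comparison videos that make examiner-cohorts comparable across schools.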


Results
Controlled comparison of examiners’ scores differed between schools by up to 16.3%, from 16.52 (95% CI 15.52-17.52) out of 27 to 19.96 (95% CI 18.94-20.97) out of 27, p<0.001. Examiner-cohorts varied more between schools than within schools (16.3% vs 8.8%). Students’ unadjusted scores suggested inter-school variation in students’ performances of up to 10.8% (17.65 (16.87-18.43) to 19.91 (19.13-20.69), p<0.001), which was no longer present after adjusting for examiner differences (18.38 (17.25-19.52) to 19.14 (18.19-20.10), 3.62% difference, p=0.69), suggesting the apparent difference was attributable to examiner, rather than student, variation. Failure rates varied between schools and were substantially altered by score adjustment (e.g. school 2: observed-score failure rate = 39.1%, adjusted failure rate = 8.7%; school 4: observed = 0.0%, adjusted = 21.7%). 

Discussion and Conclusions
We found substantial inter-institutional differences in examiner stringency, which would challenge the equivalence of outcomes if replicated in a summative setting. These apparent variations in graduation-level expectations warrant prospective investigation in summative settings to safeguard equivalence nationally. VESCA offers a feasible method for performing these comparisons. 



References (maximum three) 

1. Yeates P, Moult A, Cope N, McCray G, Xilas E, Lovelock T, et al. Measuring the Effect of Examiner Variability in a Multiple-Circuit Objective Structured Clinical Examination (OSCE). Academic Medicine. 2021;96(8):1189–96. 

2. Sebok SS, Roy M, Klinger DA, De Champlain AF. Examiners and content and site: Oh My! A national organization’s investigation of score variation in large-scale performance assessments. Adv Health Sci Educ. 2015;20(3):581–94. 

3. Yeates P, Maluf A, Kinston R, Cope N, McCray G, Cullen K, et al. Enhancing Authenticity, Diagnosticity and Equivalence (AD-Equiv) in multi-centre OSCE exams in Health Professionals Education. Protocol for a Complex Intervention Study. BMJ Open. 2022;12:e064387. doi: 10.1136/bmjopen-2022-064387 

Pavla Simerska Taylor1
Bunmi Malau-Aduli2, Karen D'Souza3, Jemma Skeat4, Melissa Wos-Oxley4, Jane Smith5, Elina Ng6 and Robyn Stevenson7
1 Griffith University
2 University of New England and the University of Newcastle
3 School of Medicine Deakin University
4 Deakin University
5 ACCLAIM, Bond University
6 Curtin Medical School
7 James Cook University




Background
Assessments, including Objective Structured Clinical Examinations (OSCEs), have been impacted by the COVID-19 pandemic.(1) Implementing costly OSCEs is challenging; however, the increased use of artificial intelligence (AI) in written assessments emphasises the value of practical assessments. There is a need to consider what an ‘OSCE’ looks like in 2024, and to re-examine its value and best practice. 


Summary of work
This study was conducted under the auspices of the Australasian Collaboration for Clinical Assessment in Medicine (ACCLAiM), which develops and benchmarks OSCEs across medical schools in Australia and New Zealand. Stakeholders’ perceptions of the continued relevance and significance of OSCEs in a post-COVID, AI-enabled world were evaluated using quantitative and qualitative methods, adding to previous ACCLAiM research findings.(2, 3) 


Results
The results indicated differences (e.g. station numbers, timing, feedback, reassessment, remediation, student progression) in how OSCEs are delivered across medical schools, which could affect their validity. The sustainability of this model of assessment was discussed by participants, and the major themes arising are presented. 


Discussion
OSCEs are still a highly relevant form of clinical assessment because they enable the evaluation of a diverse range of skill sets. However, a quality assurance approach to OSCEs is recommended to ensure equitable standards in OSCE assessment across medical schools. More research is needed to understand the impact of current assessment practices on learning. 


Conclusions
The results from this study show the current state of OSCEs across medical schools, including areas of best practice and areas that would benefit from more collaborative research to improve practice. 


Take-home messages / implications for further research or practice 
The OSCE continues to play an important role in assessment with relevance and value in an ever-evolving educational landscape, especially when AI can compromise the reliability of written examinations. 



References (maximum three) 

1. Malau-Aduli BS, Jones K, Saad S, Richmond C. Has the OSCE Met Its Final Demise? Rebalancing Clinical Assessment Approaches in the Peri-Pandemic World. Front Med (Lausanne). 2022;9:825502. 

2. Heal C, D’Souza K, Banks J, Malau-Aduli BS, Turner R, Smith J, et al. A snapshot of current Objective Structured Clinical Examination (OSCE) practice at Australian medical schools. Medical Teacher. 2019;41(4):441-7. 

3. Heal C, D’Souza K, Hall L, Smith J, Jones K. Changes to objective structured clinical examinations (OSCE) at Australian medical schools in response to the COVID-19 pandemic. Medical Teacher. 2022;44(4):418-24.