Ottawa 2024

How can I enhance equivalence in my OSCE? Opportunities and challenges of new innovations to support or equate for examiner variability.

Oral Presentation

10:15 am

28 February 2024

M204

Technical matters in OSCEs

Presentation Description

Peter Yeates1
Becky Edwards1 and Richard Hays2
1 Keele University
2 JCU Murtupuni Centre for Rural & Remote Health




Background
Equivalence can be defined as the tendency for a given student to reach the same outcome in an assessment regardless of where or when they are examined. Ensuring equivalence in high-stakes performance assessments (for example, OSCEs or standardized patient exams) is vital for patient safety and candidate fairness, and is a growing focus of regulators. Equivalence matters both within institutions that run large or distributed exams and between institutions nationally.


Why is the topic important for research and / or practice? 
Several countries or regions internationally have announced, or are moving towards, national licensing exams, or share OSCE stations to aid alignment. Testing clinical skills or performance at this scale often requires that candidates be examined across several locations, or at multiple times with different groups of examiners. Ensuring alignment of the judgements made by different groups of examiners (at separate times or in different locations) is critical to the fairness of large-scale exams but has traditionally been challenging both to investigate and to support. Recent innovations in this space (Video-based Examiner Score Comparison and Adjustment (VESCA)(1) and Video-based Benchmarking (VBB)(2)) offer novel approaches to calibrate, compare, or even equate for examiner differences, which can supplement traditional approaches to faculty development.


Workshop format, including participant engagement methods 
Workshop facilitators will invite participants to share their experiences, opinions and recommendations for enhancing OSCE equivalence. Building from current best-practice recommendations for enhancing graduation level performance assessment(3), facilitators will use case studies to illustrate the practical use of two approaches to enhancing equivalence: 

  1. examiner calibration (VBB) and 

  2. score adjustment (VESCA). 

Using data from recently completed and ongoing research, facilitators will illustrate the extent and impact of examiner variability across locations within an OSCE, and its implications for equivalence. They will present data that inform the opportunities and challenges of using these relatively novel approaches to either calibrate examiners or equate for their differences, including data on the accuracy and effectiveness of each approach, and data describing how participants interact with, use, and trust each approach.

Working in groups, participants will critically reflect on the relative merits of trying to calibrate examiners versus adjusting for their differences. Participants will work together to produce actionable plans for enhancing their own OSCEs. 


Who should participate? 
Anyone involved in or interested in OSCEs, particularly OSCEs run across multiple parallel tracks or different locations.


Level of workshop 
All


Workshop outcomes:

By the end of the workshop, participants will be able to: 

  • Describe the importance and challenges of equivalence in distributed OSCEs. 
  • Understand the roles and implications of novel approaches to supporting examiner equivalence. 
  • Critically reflect on the opportunities and challenges of these approaches in relation to their own situation. 
  • Plan potential methods to investigate or support equivalence in their own OSCE settings.


References (maximum three) 

  1. Yeates P, Moult A, Cope N, McCray G, Xilas E, Lovelock T, et al. Measuring the Effect of Examiner Variability in a Multiple-Circuit Objective Structured Clinical Examination (OSCE). Academic Medicine. 2021; 96(8):1189–96. 

  2. Edwards R, Yeates P, Lefroy J, McKinley R. Addressing OSCE Examiner Variability: A Video-Based Benchmarking Approach. In: Association for Medical Education in Europe Annual Conference. 2021. p. 1.1.2: 8682. 

  3. Malau-Aduli BS, Hays RB, D’Souza K, Saad SL, Rienits H, Celenza A, et al. Twelve tips for improving the quality of assessor judgements in senior medical student clinical assessments. Med Teach. 2023;(26):1–5.
