Presentation Description
Matt Homer1
1 University of Leeds, School of Medicine
Background and context
In many high-stakes settings, an internet-based industry supports candidates in succeeding in OSCE-type assessments. There is a natural concern that some exam material might, over time, become available to prospective candidates, and that this might systematically affect station and exam difficulty. This study investigates whether there is any evidence of this happening in practice in the UK's PLAB2 exam, taken by international graduates who want to work in the NHS. There is only limited, and partially relevant, literature on this issue (e.g. McKinley and Boulet 2004; Baig and Violato 2012).
Data and methods
The quantitative analysis models variation in station-level facility (n > 18,000 station-level observations across > 750 different stations over the period 2016 to 2023) and controls for:
- different underlying station facility and examiner stringency (via random intercepts)
- date (to see if there is an overall trend upward or downward in facility over time), and
- the number of times a station has been used up to that point (an illustrative specification of this kind of model is sketched after this list).
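As a minimal sketch of the kind of cross-classified random-intercept model this implies (the notation, scale of facility and coding of date are illustrative assumptions, not details taken from the study), the facility y_{set} of station s marked by examiner e on occasion t could be written as

y_{set} = \beta_0 + \beta_1\,\text{date}_t + \beta_2\,\text{uses}_{st} + u_s + v_e + \varepsilon_{set},
u_s \sim N(0, \sigma^2_{\text{station}}), \quad v_e \sim N(0, \sigma^2_{\text{examiner}}), \quad \varepsilon_{set} \sim N(0, \sigma^2_{\varepsilon}),

where uses_{st} counts how many times station s has been administered up to occasion t.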
Results
Once other factors are accounted for, 27% of the variance in facility is due to station and 21% to examiner. Date has a very small negative effect, indicating an overall decline in facility of around 1% a year across all stations. Finally, there is some, albeit limited, evidence of a slight increase in station facility each time a station is used: for every 100 times a station is used, its facility increases by an average of 11%.
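For illustration only (this is the standard variance-partition calculation for such models, assumed here rather than quoted from the study), the 27% and 21% figures correspond to shares of total variance of the form

\frac{\sigma^2_{\text{station}}}{\sigma^2_{\text{station}} + \sigma^2_{\text{examiner}} + \sigma^2_{\varepsilon}} \approx 0.27, \qquad \frac{\sigma^2_{\text{examiner}}}{\sigma^2_{\text{station}} + \sigma^2_{\text{examiner}} + \sigma^2_{\varepsilon}} \approx 0.21,

and, under the model sketched above, the usage effect corresponds to 100 \times \hat{\beta}_2 \approx 11\%.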
Discussion/Conclusion/Take home messages
This work suggests that any 'station leakage' over time has, at most, a minimal impact on station difficulty as a function of how often stations have been used. However, policy might be strengthened, and the public better reassured, by retiring stations once they reach a certain level of re-use.
References
Baig LA, Violato C. 2012. Temporal stability of objective structured clinical exams: a longitudinal study employing item response theory. BMC Medical Education. 12(1):121. https://doi.org/10.1186/1472-6920-12-121
McKinley DW, Boulet JR. 2004. Detecting score drift in a high-stakes performance-based assessment. Advances in Health Sciences Education. 9(1):29–38. https://doi.org/10.1023/B:AHSE.0000012214.40340.03