Ottawa 2024

Approaches to OSCE

Oral Presentation

1:30 pm

26 February 2024

M211

Session Program

Louise Curley1
Emma Batey1, Melanie Begovic1, John Egan1 and Sachin Thakur1 
1 The University of Auckland 



Background
Objective Structured Clinical Examinations (OSCEs) are recognised to be a source of anxiety and stress (1). Mock OSCEs reduce anxiety and improve confidence (1); however, they create significant demands on resources, including time, cost, staffing and preparation. Recent evidence has indicated that the use of technology could also help students prepare for OSCEs (2).


Summary of work & Results 
We recognised an opportunity to design and build an online learning experience to aid preparation for OSCEs. The design of this ‘OSCE toolkit’ took into consideration core principles of e-learning, e.g., Bates and Poole’s SECTIONS framework and the seminal work by Anderson (3). We developed an interactive book in H5P for students to access. The interactive resource encompassed three core sections: 1) Background information; 2) Course-specific OSCE logistics; and 3) Stations. For each station, section three contained eight key features plus interactive elements for students to engage with. Features included videos role-playing the station and debriefing discussions. Questionnaires were completed by students to investigate their perceptions of resources to aid OSCE preparation and the specific features of the toolkit that they found useful. Overall, responses were positive, with suggestions for further improvement.


Discussion 
The OSCE toolkit has the potential to be a useful online resource to help prepare students for their OSCE. Care needs to be taken that the resource meets best-practice guidance, provides individual feedback to the student and is authentic. The use of technology to help prepare students for OSCEs is in line with research by Flood and colleagues (2), who reported that video cases support students in preparing for high-stakes OSCEs.


Conclusion 
Development of a comprehensive online resource toolkit could be an alternative method of equipping students for OSCEs; however, such resources should be systematically designed to ensure they are integrated and meaningful.



References (maximum three) 

1. Robinson P, Morton L, Haran H, Manton R. Mock OSCEs improve medical students' confidence and reduce anxiety related to summative examinations. Education in Medicine Journal. 2017 Apr 1;9(2):41-5.

2. Flood M, Strawbridge J, Sheachnasaigh EN, Ryan T, Sahm LJ, Fleming A, Barlow JW. Supporting pharmacy students' preparation for an entry-to-practice OSCE using video cases. Currents in Pharmacy Teaching and Learning. 2022 Dec 1;14(12):1525-34. 

3. Anderson T. Towards a theory of online learning. Theory and practice of online learning. 2004;2:109-19. 

Michael Poulton 


The integration of near peer assessors (NPAs) in Objective Structured Clinical Examinations (OSCEs) has gained prominence within medical education. Existing literature underscores NPAs' potential to enhance formative assessment processes by providing relatable insights and quality feedback to peers, and by cultivating critical assessment skills (1,2). This project investigated the perceptions of faculty examiners, Year 3 medical students, and Year 4 near peer assessors regarding the integration of NPAs in formative OSCEs.

Faculty examiners (n=8), Year 3 medical students as peer-assessed participants (n=29), and Year 4 students as NPAs (n=18) from a single Victorian clinical school were engaged in a seven-station formative OSCE. An individualised written survey was provided to each group of participants at the end of the OSCE. Responses to statements were given via a five-point Likert scale, and free-text responses were collected.

Initial survey results demonstrated favourable opinions among faculty examiners and Year 3 students regarding the involvement of NPAs in formative OSCEs. Both groups recognised advantages such as heightened student engagement, enriched peer-based learning in a supportive environment, and the ability of NPAs to provide valuable insights and constructive feedback. Faculty examiners agreed that maintaining consistency, standardisation and objectivity among near peer assessors may be challenging, a viewpoint not shared by the majority of the Year 4 NPA cohort. Free-text responses suggested that Year 4 NPAs valued the experience for the development and refinement of their own clinical skills and assessment literacy. The need for additional training and support for near peer assessors to fulfil the role requirements effectively was consistently highlighted by all participants.

The results of this project contribute to the growing body of evidence reporting on the benefits, challenges, and effectiveness of utilising NPAs as contributors to the assessment process, and lend support to their routine use in formative OSCEs.



References (maximum three) 

1. Khan, R., Payne, M. W., & Chahine, S. (2017). Peer assessment in the Objective Structured Clinical Examination: A scoping review. Medical Teacher, 39(7), 745–756. https://doi.org/10.1080/0142159x.2017.1309375 

2. Schwill, S., Fahrbach-Veeser, J., Moeltner, A., Eicher, C., Kurczyk, S., Pfisterer, D., Szecsenyi, J., & Loukanova, S. (2020). Peers as OSCE assessors for junior medical students – a review of routine use: A mixed methods study. BMC Medical Education, 20(1). https://doi.org/10.1186/s12909-019-1898-y

Claire Palermo1
Sue Kleve and Zoe Davidson
1 Monash University



Background:
Oral interviews have been criticised on validity and reliability grounds. Behaviour-based interviews, developed in the human resources field, aim to examine essential competencies of practice. Their underlying premise is that the way in which a person handled situations in former experiences is likely the way in which they will handle them in future experiences (1). While behaviour-based interviews have been reported as a successful selection strategy (2), their utility in competency-based assessment has not yet been described.


Method:
Behaviour-based interviewers determine ahead of time the behaviours that are essential to perform the required practice (i.e., competencies), and then develop questions to explore, and scoring systems to assess, those performance behaviours. In this study, interview questions were developed based on the National Competency Standards for Dietitians in Australia. A key competency area was selected, with relevant performance indicators for examination. The interview consisted of a lead-in question followed by several prompting questions based on recent placement experience. The rubric consisted of a three-point scale (yes, no, n/a) indicating whether the assessor believed the student would be independently capable of each performance criterion. The interview was implemented for students as part of programmatic assessment at the completion of a practical placement over three years.
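As an illustration only, the sketch below shows how a percentage grade might be derived from such a three-point rubric. The criterion names, data, and aggregation rule (percentage of applicable criteria rated "yes") are hypothetical assumptions; the abstract reports only the resulting percentage grades.

```python
# Hypothetical sketch: deriving a percentage grade from a three-point
# (yes / no / n/a) rubric. Criterion names and the aggregation rule
# are assumptions, not the study's published method.
ratings = {
    "identifies key practice issues": "yes",
    "plans an appropriate intervention": "yes",
    "communicates with the client": "no",
    "documents outcomes": "n/a",
}

# Exclude non-applicable criteria, then grade on the remainder.
applicable = [r for r in ratings.values() if r != "n/a"]
grade = 100 * applicable.count("yes") / len(applicable)
print(f"Interview grade: {grade:.0f}%")  # -> 67%
```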


Results:
Twelve assessors were trained in the use of the interview technique and implemented the assessment over the 2020–2022 period. In total, 216 students completed the assessment interview (93% female), with a mean age of 23.9 years (range 21–55). The median grade for the interview was 73% across all three years (range 60–85%).


Discussion:
Behaviour-based interviews potentially offer a new approach to assessing competence. Other approaches that reflect performance in context, the outcomes of performance, and the reality of practice, including teamwork, should be considered as the future of competency-based assessment.



References (maximum three) 

1. Green, P., Alter, P., & Carr, A. (1993). Development of Standard Anchors for Scoring Generic Past-Behaviour Questions in Structured Interviews. International Journal of Selection and Assessment, 1, 203-212.

2. Altmaier, E., Smith, W., O'Halloran, C., & Franken Jr, E. (1992). The predictive utility of behavior-based interviewing compared with traditional interviewing in the selection of radiology residents. Investigative Radiology, 27(5), 385-389.

Bradley Williams1
Renee Harvey1, Lizzi Shires1, Nara Jones1, Anthea Dallas1 and Rohan Church1
1 University of Tasmania



Background 
Medical education providers must ensure they deliver fair and defensible quality-assured assessment processes.[1] The Objective Structured Clinical Examination (OSCE) is a complex assessment, requiring a highly coordinated group of clinicians, simulated patients (SPs) and administrative staff, geographically dispersed across testing centres. 


Summary of work 
With quality assurance (QA) at its centre, our OSCE has been refined through the innovative and creative use of technology to deliver a fair assessment that is acceptable to all participants.

Briefing:
Examiners are provided with station material online, including calibrated video examples that they mark and on which they receive immediate feedback from experienced markers. SPs receive their material in the same format. An online briefing session is offered prior to the assessment for examiners and SPs to meet and clarify their roles.

Live QA:
QA examiners monitor station performance remotely via livestream through a laptop strategically placed in each station and record their observations using an electronic QA marksheet. 

Results:
Marks are collected electronically using online examination software, and results are moderated using a custom-designed spreadsheet template, allowing timely release of results.

Evaluation:
All examiners and SPs are invited to participate in an online survey, composed of quantitative (Likert scale) and qualitative (free text) questions, to provide feedback on their experience of the QA process.


Results 
We have collected several years of survey data and will present an analysis of the 2023 results, anticipating that approximately 30 SPs and 30 examiners will respond.

Descriptive statistics and a thematic analysis of free-text responses will be reported, relating to the perceived efficacy of the examiner and SP calibration processes.


Discussion & Conclusions 
Innovative use of online delivery allows QA processes to be conducted remotely in a way that is acceptable to participants and ensures comparable OSCE delivery across sites. 


Take-home messages 
Sharing this approach to remote QA may inspire innovation at other educational institutions. 



References (maximum three) 

1. Malau-Aduli B, Hays R, Van Der Vleuten C. Understanding Assessment in Medical Education Through Quality Assurance. 1st ed. McGraw Hill; 2021 Aug 27.

Angelina Lim
Daniel Malone, Sunanthiny Krishnan, Simon Furletti and Mahbub Sarkar 



Background:
Although a thoughtfully designed Objective Structured Clinical Examination (OSCE) is a robust and valid assessment tool, debate exists about the ability of OSCEs to authentically mirror real-life scenarios. Few studies have explored the extrapolation inference in Kane's validity framework (1).


Summary of work:
A sequential mixed-methods approach was used. Mystery shoppers visited pharmacy students on their community pharmacy placement and simulated the same case scenario students were given in a recent infectious diseases OSCE. Students were marked with the same rubrics, and these marks were compared with their OSCE scores. The mystery shopping visit was then revealed to all students, who were asked to participate in a semi-structured interview.


Results:
Overall, 92 mystery shopper (Work Based Assessment (WBA)) visits with students were conducted and 36 follow-up interviews were completed. The median WBA score was 39.2% lower than the OSCE score (p < 0.001). Interviews revealed students knew they did not perform as well in the WBA as in their OSCE, but they reflected that they still needed OSCEs to prepare them to manage a real-life patient.
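The abstract does not name the statistical test behind this comparison; a Wilcoxon signed-rank test on paired per-student scores is one standard choice for a median comparison of this kind. The sketch below is a minimal illustration under that assumption, with invented data.

```python
# Hypothetical sketch: comparing paired OSCE and WBA scores with a
# Wilcoxon signed-rank test. The test choice and data are assumptions;
# the abstract reports only the median difference and p-value.
import numpy as np
from scipy.stats import wilcoxon

osce_scores = np.array([78.0, 82.5, 70.0, 88.0, 75.5, 80.0, 91.0, 85.0])
wba_scores = np.array([45.0, 50.5, 38.0, 52.0, 41.5, 49.0, 55.0, 47.5])

stat, p_value = wilcoxon(osce_scores, wba_scores)
median_drop = np.median(osce_scores - wba_scores)
print(f"Median paired drop: {median_drop:.1f} points, p = {p_value:.4f}")
```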


Discussion:
Many students related their performance to how they perceived their role in OSCEs versus WBAs, noting that OSCEs allowed them more autonomy to manage the patient, as opposed to an unfamiliar workplace.


Conclusions:
OSCE performance did not correlate with real-life performance in this instance; however, students still valued having an OSCE before a WBA.


Take-home messages:
Students scored lower on placement than in OSCEs, even though they reflected that it is easier to manage a patient in real life; this highlights the challenges of replicating a real-life pharmacy situation in an OSCE, and the challenges of testing extrapolation. Whilst OSCEs are useful for testing process-type skills, clinical problem solving may be best assessed in a workplace environment.



References (maximum three) 

1. Kane, M. T. (1992). An argument-based approach to validity. Psychological Bulletin, 112, 527-535.

Gabriel Lau1
Barry Soans1, Mark Philips1, Anne Rogan1, Brendan Grabau1 and Nicole Groombridge1
1 RANZCR



The Royal Australian and New Zealand College of Radiologists (RANZCR) commissioned a full-scale assessment review in 2015. One finding was that the Viva Fellowship Examination was no longer fit for purpose: it lacked standardisation in format, had no formal standard setting, was prone to examiner variation and potential bias, and was somewhat contrived and not reflective of current clinical practice.

Assessment working groups were established to re-conceptualise the assessment of clinical competence in radiology, and a new Objective Structured Clinical Examination in Radiology (OSCER) was designed. After much planning and development (and delays due to the pandemic), the first OSCER was administered in June 2023. 

The new Objective Structured Clinical Examination in Radiology enhances reliability through a large number of stations and data points. It is highly standardised through careful blueprinting, case selection, specific questions, and clear marking guides with rubrics. The same cases are now shown to all candidates on the same day. The potential for examiner variation and bias is minimised, and examiners are routinely trained and engage in multiple calibration processes. The oral interaction format still allows for deep probing of knowledge, understanding and higher-order thinking, with real-time clarification. Borderline regression standard setting has also been implemented. Most importantly, the examination is now less contrived and more reflective of clinical practice.
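For readers unfamiliar with borderline regression, the sketch below illustrates the general technique with hypothetical data: examiners give each candidate both a station score and a global rating, and the pass mark is the score the regression line predicts at the "borderline" rating. The rating scale, data and code are illustrative assumptions, not RANZCR's implementation.

```python
# Hypothetical sketch of borderline regression standard setting.
# Assumed 5-point global rating scale (1=clear fail, 2=borderline,
# 3=clear pass, 4=good, 5=excellent); data are invented.
import numpy as np

global_ratings = np.array([1, 2, 2, 3, 3, 3, 4, 4, 5, 5])
station_scores = np.array([35, 48, 52, 60, 63, 66, 72, 75, 84, 88])  # %

# Regress station score on global rating (simple linear fit).
slope, intercept = np.polyfit(global_ratings, station_scores, 1)

# The station cut score is the predicted score at the borderline rating.
BORDERLINE_RATING = 2
cut_score = slope * BORDERLINE_RATING + intercept
print(f"Station pass mark: {cut_score:.1f}%")
```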

The re-design from Viva to OSCER was part of a “whole of assessment program” review, to ensure the value and purpose of all assessments and examinations in the training program. The OSCER utilises OSCE format design criteria while considering how best to use the format to assess clinical competence in radiology training. The OSCERs are now RANZCR’s capstone assessment of clinical reasoning and thinking, and of the integration of the relevant knowledge and ability of a clinical radiologist.


References (maximum three) 

RANZCR OSCER 

Lydia Timms1
Elizabeth Hill1 and Margo Brewer2,1
1 Curtin University
2 ANZAHPE




Assessment of clinical competence is essential to ensure speech pathology graduates are practice-ready. Accreditation requirements and assessment pedagogy have required universities to shift from a reliance on clinical educator reports of placement competence to include additional standardised assessment of competence. One standardised assessment is an Objective Structured Clinical Examination (OSCE), which some view as the gold standard of clinical assessments. Student experience significantly impacts engagement in academic learning and wellbeing, so this study reports on the second iteration of an OSCE following implementation of student feedback on a pilot OSCE (the first OSCE in an Australian postgraduate speech pathology course to assess graduate-ready competency).

A mixed-methods approach was adopted. An anonymous online survey was completed by 15 speech pathology students (45% of the cohort) at the end of their course and by 11 examiners (95% of the cohort). Participants rated a series of statements designed to capture their experience with the assessment on a 5-point Likert scale. They were also asked to respond qualitatively to two questions to further explore their experience.

Despite high levels of anxiety, most students reported understanding the purpose of the OSCE, embraced the assessment as a learning opportunity, and could see the value of this assessment in their course. However, students did not see the OSCEs as reflecting real-life practice and were equivocal about the fairness of this assessment type. Their qualitative responses reflected similar sentiments, with expansion on points related to a) authenticity, b) emotions/experience and c) learning and logistics.

Examiners, who were all new to marking OSCEs, provided positive ratings for all items, excluding statements related to clarity of marking guides. 

This assessment type remains embedded in the course with updates to examiner training, student preparedness and the authenticity of the simulated client and clinical scenario. 




References (maximum three) 

Hill, E., Timms, L., & Brewer, M. (under final review). A pilot study of Objective Structured Clinical Examinations in a postgraduate speech pathology program. Journal of Clinical Practice in Speech-Language Pathology.

Quigley, D., & Regan, J. (2021). Introduction of the objective structured clinical examination in speech and language therapy education: student perspectives. Folia Phoniatrica et Logopaedica, 73(4), 316-325. 

Yap, K., Sheen, J., Nedeljkovic, M., Milne, L., Lawrence, K., & Hay, M. (2021). Assessing clinical competencies using the objective structured clinical examination (OSCE) in psychology training. Clinical Psychologist, 25(3), 260-270.