Ottawa 2024

OSCE logistics and variants

Oral Presentation

11:30 am

28 February 2024

M209


Margo Lane1
Ashlee Forster1, Mary Kelleher1, Belinda Swyny1 and Sharee Stedman1 
1 UQ Medical School 



Background
One component of the system of assessment for The University of Queensland’s (UQ) new MD Design curriculum is the Year 1 Objective Structured Clinical Examination (OSCE). This OSCE will be delivered in an open-plan, non-clinical teaching space for a large cohort (N=480). Prior research indicates that the materials, technologies and spaces used during OSCEs may influence stakeholder performance (1). The aim of this study is to explore the experiences and perspectives of key stakeholders in this novel Year 1 OSCE, including professional and academic staff, students, simulated participants and examiners.


Summary of Work
This study employed a post-activity survey design. The online survey was designed by the researchers and comprises a series of statements rated on Likert scales, together with free-text responses that allow key stakeholders to provide detailed feedback on their Year 1 OSCE experiences. Descriptive statistics and thematic analysis will be used to analyse the data.


Results
Survey and free text response data will be analysed and presented at Ottawa Conference 2024. 


Discussion
We anticipate that the key stakeholder experiences will vary depending on a range of factors including their role, prior OSCE experiences and personal expectations. These results will inform iterative quality improvement for the UQ Year 1 OSCE and contribute to the body of knowledge regarding open-plan OSCEs for large cohorts. 


Conclusions
We hypothesize that the key factors for successful delivery of a large cohort, open-plan OSCE are excellent communication and collaborative teamwork. All key stakeholders have valuable perspectives and experiences, and these should be considered in the planning and development of future OSCEs. 


Take-home messages / implications for further research or practice 
This study has provided perspectives on a large cohort, open-plan OSCE implementation, leading to opportunities to streamline processes to enhance future iterations of similar OSCEs. 



References

1. Rees CE, Ottrey E, Barton P, Dix S, Griffiths D, Sarkar M, Brooks I. Materials matter: understanding the importance of sociomaterial assemblages for OSCE candidate performance. Medical Education. 2021;55:961-971.

Marina Sawdon1
Lucy Ambrose1
1 University of York, Hull York Medical School 




Background
The Objective Structured Long Examination Record (OSLER) assesses clinical skills in an integrated manner (1). It is longer in duration and uses real patients, and therefore more closely simulates a real clinical encounter. However, OSLERs may suffer lower reliability than OSCEs due to fewer cases, assessor variability and the use of different patients.


Summary of work
In preparation for the upcoming national licensing assessment in the UK, we aimed to improve the reliability of our six-station OSLER through clinical examiner training and calibration, by incorporating standardised stations using simulated patients while retaining some real-patient stations, and by providing standardised examiner questions. In addition, we have introduced a conjunctive standard requiring a minimum level of competence in each clinical competency domain assessed. Candidates are assessed on five clinical competency domains in all stations.


Results
Following the changes described we have seen an increase in the reliability of the OSLER from 0.636 to 0.803. The conjunctive standard will be introduced in the current academic year and will be described in the presentation. 


Discussion
We have seen a 26% increase in reliability after introducing more rigorous clinical examiner training and calibration exercises, incorporating some common, standardised cases across all students, and standardising examiner questions.


Conclusions
We have demonstrated that the reliability of OSLERs can be improved and further standardised whilst maintaining authenticity. Implementation of the conjunctive standard described is innovative, has face validity, and avoids compensation between competencies. 


Take home message
We have shown that the OSLER can be a reliable, authentic and holistic method for assessing the clinical competencies required to become a Foundation Doctor. With the addition of the conjunctive method described, it will increase public confidence that our graduates are all-round competent doctors.

References

1. Gleeson FA. Assessment of clinical competence using the objective structured long examination record (OSLER). Medical Teacher. 1997;19:7-14.

Julia Harrison1
Michelle Leech1 and Arunaz Kumar1
1 Monash University 



The Assessment of Physical Examination skills (APEX) tool was developed for final-year medical students to ensure graduates were safe to practise their physical examination (PE) skills. It involved summative assessment of the physical examination of a Simulated Patient (SP) in the presence of an assessor (a senior doctor), with immediate verbal feedback and repeat attempts if required, unlike a summative Objective Structured Clinical Examination (OSCE), which offers no detailed feedback or opportunity to improve. The goals were: identification and remediation of students deemed below standard; assessment for learning; motivation for learning; and reassurance of readiness and safety in professional practice.

At the end of the academic year, anonymous, optional online evaluation surveys (multiple-choice questions and free-text responses) were offered to all assessors and students, addressing applicability, confidence in PE, the value of the APEX assessment, the feedback provided and the impact on PE skills. Descriptive statistics were used to report Likert scale responses, and content analysis was used to analyse free-text responses. SPs were also interviewed for their perspectives on the assessment, and their responses were thematically analysed.

130/507 students responded to the survey. A total of 124 (95%) students reported confidence in physical examination skills, 111 (85%) appreciated the opportunity for the assessment, and 102 (78%) appreciated the feedback they received. Themes from the content analysis of student transcripts suggested increased motivation to practise PE skills (41), increased confidence (21), opportunity for feedback (15) and a positive learning environment (9). 33/53 assessors responded to the survey; 70% preferred the new assessment to the OSCE, mainly for its perceived greater learning value, while 15% preferred the OSCE for assessing PE skills. Students, assessors and SPs reported favourably on the design, particularly the opportunities for feedback, the motivation for learning and the similarity to real-life clinician-patient discourse.



James Fraser1
Tarun Sen Gupta1,2, Richard Hayes1,2, Lucie Walters1,3, Eugene Wong1, James Dawber1 and Matthew Thurecht1
1 Australian College of Rural and Remote Medicine
2 James Cook University
3 University of Adelaide 



The Australian College of Rural and Remote Medicine (ACRRM) was established in 1999 and is one of two colleges providing General Practice training in Australia. ACRRM Fellows complete a four-year rural generalist training program that equips them with the knowledge, primary care skills and advanced skills to provide patient care in rural and remote Australia. Graduates are Rural Generalists: formally recognised specialist practitioners who provide a combination of extended general practice and additional expertise in a subspecialty that rural communities need but cannot support in a full-time sub-specialist roster.

The ACRRM Fellowship assessment program has a programmatic design and is delivered wholly remotely. It utilises a variety of modalities, including multiple choice questions, multi-source feedback, mini-CEX, case-based discussions and a procedural logbook. ACRRM pioneered the development of a remotely delivered, multi-station clinical assessment in 2008 and has run it once or twice yearly since (Sen Gupta et al., 2021). This assessment method, known as the Structured Assessment of Multiple Patient Scenarios (StAMPS), consists of eight ‘stations’ in which candidates explore standardised clinical scenarios and respond to a series of questions from a sole examiner, with limited prompting. The scenarios are set in a standardised hypothetical location, “Stampsville”, which has a defined community profile. Assessment content is developed by practising rural generalists and focuses on assessing clinical practice in authentic rural environments.

This presentation will discuss the utility of this unique remotely delivered assessment. 



References

1. Sen Gupta T, Campbell D, Chater A, Rosenthal D, Saul L, Connaughton K, Cowie M. Fellowship of the Australian College of Rural & Remote Medicine (FACRRM) assessment: a review of the first 12 years. MedEdPublish. 2020. https://doi.org/10.15694/mep.2020.000100.1

2. Sen Gupta TK, Wong E, Doshi D, Hays RB. ‘Stability’ of assessment: extending the utility equation. MedEdPublish. 2021;10.