Ottawa 2024

Clinically focused assessment

Prep


10:30 am

26 February 2024

M205

Session Program

Khaled Almisnid1
Matt Homer2, Trudie Roberts3 and Rikki Goddard-Fuller4
1 Medical Education Unit, Unaizah College of Medicine, Qassim University, KSA. 2 University of Leeds
3 Emeritus Professor of Medical Education, University of Leeds, UK
4 Christie Education, Manchester, UK




Research Questions
1) How is the OSCE being implemented in Saudi medical schools? 
This question examines how the OSCE is implemented in the Saudi context and explores whether that context shapes its implementation. 

2) What are the opportunities and challenges offered by adopting the OSCE in Saudi medical schools? 
This seeks to explore drivers, opportunities, and challenges in implementing high-quality OSCEs in Saudi Arabia (SA). 


Methodology 
A qualitative, in-depth case study approach was employed to gather rich narrative and documentary data, building on my constructivist perspective. Each case study gathered assessment documents and conducted interviews with administrative staff and focus groups with OSCE designers and examiners. Codebook analysis reviewed documents, while the interviews and focus group data were analysed using reflexive thematic analysis. 


Findings 
Four key themes were generated: institutional and assessment culture, faculty expertise and practices, OSCE quality and design, and resource and infrastructure setup. Across both case studies, a central overarching theme was that each stage of OSCE implementation involves a series of dilemmas and compromises. In addition, several factors appear to influence how the OSCE is implemented in this context, including the source of medical school funding (public/private), accreditation status, faculty experience, and the availability of resources. Based on these findings, I developed a framework intended to assist institutions in designing and maintaining high-quality OSCEs. 

Questions for discussion with participants 
What areas of this project would I need to consider developing further? 
If I could replicate this project elsewhere, what alternative data collection and analysis methods would you recommend? 
Considering the present research findings, which area warrants further investigation? 
Do you believe that collaborating with other countries in the region to analyse the OSCE implementation further would be beneficial? 



References (maximum three) 

Boursicot, K., Kemp, S., Wilkinson, T., Findyartini, A., Canning, C., Cilliers, F. and Fuller, R. 2020. Performance assessment: Consensus statement and recommendations from the 2020 Ottawa Conference. Medical Teacher. 43(1), pp.58-67. 

Harden, R., Lilley, P. and Patricio, M. 2016. The Definitive Guide to the OSCE. London: Elsevier. 

Khan, K.Z., Ramachandran, S., Gaunt, K. and Pushkar, P. 2013. The Objective Structured Clinical Examination (OSCE): AMEE Guide No. 81. Part I: An historical and theoretical perspective. Medical Teacher. 35(9), pp.e1437-e1446. 

Sumathy M.K.1
Zayapragassarazan Zayabalaradjane1, Dinker Ramananda Pai2 and Mukta Wyawahare1
1 The Jawaharlal Institute of Postgraduate Medical Education & Research
2 Mahatma Gandhi Medical College and Research Centre, Puducherry, India 



Background:
Technological advancements have elevated simulation technology from a teaching tool to a comprehensive learning management system, integrating debriefing, data collection, and analysis. 


Research Questions:
This study addresses two research questions: 1) Are the educational outcomes of a competency-based hybrid simulation training module observable and measurable? 2) Will the competency-based hybrid simulation training module improve the educational outcomes of the curricular objectives for 3rd-year undergraduate medical students? 


Methodology:
The study has three main objectives: 1) develop a simulation skill training module, 2) evaluate outcomes, and 3) assess the feasibility of curriculum incorporation. In a three-phase mixed-methods approach, Phase 1 used the modified Delphi method to identify skills and the ADDIE model to create procedural modules. These materials were shared via Google Classroom using a flipped classroom model. In Phase 2, Peyton's strategy was adopted for interactive simulation skill training for 3rd-year medical students during surgical postings. SimCapture facilitated data collection with OSCE checklists, pre/post-tests, perception tools, and debriefing videos. Ongoing Phase 2 data collection seeks educational insights, while Phase 3 involves assessing module effectiveness and potential modifications through focus group discussions. 


Findings So Far
In Phase 1, the essential skills identified were basic life support, oxygen therapy, IV cannulation, nasogastric tube insertion, urinary catheterization, and suturing. Training of trainers, content validation, and tool assessments took place. The pilot study confirmed the inter-rater reliability of the OSCE checklist, the test-retest reliability of the knowledge questionnaire, and the internal consistency (Cronbach's alpha) of the perception tool. The ongoing Phase 2 study aims to provide insights into learner profiles, inform future curriculum mapping, guide policymakers in aligning skill training modules, contribute to understanding simulation-based learning dynamics, and empower educators to embrace integrated teaching methods through SimCapture data. 


Keywords:
competency-based education, medical education, simulation training, procedural skills, curriculum enhancement. 



References (maximum three) 

  1. Medical Council of India. Assessment Module for Undergraduate Medical Education Training Program, 2019: pp 1-29. 

  2. Branson, R. K., Rayner, G. T., Cox, J. L., Furman, J. P., King, F. J., & Hannum, W. H. (1975). Interservice procedures for instructional systems development (Phases I, II, III, IV, V, and Executive Summary). US Army Training and Doctrine Command Pamphlet, 350. https://apps.dtic.mil/dtic/tr/fulltext/u2/a019486.pdf 

  3. Motola I, Devine LA, Chung HS, Sullivan JE, Issenberg SB. Simulation in healthcare education: a best evidence practical guide. AMEE Guide No. 82. Med Teach. 2013 Oct;35(10):e1511-30. doi: 10.3109/0142159X.2013.818632. Epub 2013 Aug 13. PMID: 23941678. 

Mohammed Najjar1
Larry Gruppen2
1 The Ohio State University
2 University of Michigan




Research Questions
Will introducing inpatient pediatric family-centered rounds as an EPA promote self-determination among senior residents? What is the effect of this implementation on patient outcomes? 


Thesis
Trust is the cornerstone for determining learners' readiness for less supervised and more autonomous workplace clinical activities. In the clinical environment, interconnected factors related to the supervisor, the trainee, the supervisor-trainee relationship, the context, and the clinical task are essential for medical educators in building entrustment of their trainees (1). Autonomous and self-determined learners are intrinsically motivated to pursue their goals, are more satisfied with their work and lives, and ultimately become higher achievers (2). 

EPAs exemplify workplace assessment based on trust. This research attempts to establish family-centered rounds (FCRs) in inpatient general pediatric services as an EPA and to assess its influence on supervisors' entrustment, learners' self-determination, and patient safety. This study provides validity evidence for EPA-driven entrustment decisions on teaching rounds that are directly related to learners' autonomy, competence, and relatedness, and to patient care and safety. 


Methods
Six experienced hospital pediatricians from four academic centers in the United States will collaborate to develop an EPA for inpatient pediatric FCRs according to ten Cate's theory and guidelines (3). We will use faculty development programs to train clinical preceptors in the new assessment tool and its application to entrustment decisions and autonomy. We will apply qualitative research methods to study preceptors' and learners' perspectives on entrustment and autonomy. We will use patient and family surveys, together with changes in organizational records of family complaints and medical errors, to assess the influence on patient outcomes. 

Questions for discussion with participants: 

  • How can the validity and reliability of this EPA be enhanced? 

  • What is the best way to decrease inter-rater variability and reach a higher level of entrustment? 

  • Do you suggest other approaches to evaluate the effect on patient care? 


References (maximum three) 

  1. Hauer KE, Ten Cate O, Boscardin C, Irby DM, Iobst W, O'Sullivan PS. Understanding trust as an essential element of trainee supervision and learning in the workplace. Adv Health Sci Educ Theory Pract. 2014 Aug;19(3):435-56. doi: 10.1007/s10459-013-9474-4. Epub 2013 Jul 27. PMID: 23892689. 

  2. Deci, E. L. and Ryan, R. M. (Eds.) (2004). Handbook of Self-Determination Research. Rochester, NY: University of Rochester Press. 

  3. ten Cate O, Schwartz A, Chen HC. 2020. Assessing trainees and making entrustment decisions: on the nature and use of entrustment-supervision scales. Acad Med. 95(11):1662-1669. 

William Holmes1
Jyoti Paul1, Claire Mallon1, Fiona Cruickshank1 and Catherine Porter1
1 The University of Manchester



Optometrists are healthcare professionals specialised in eye care. The current UK route to becoming an optometrist involves completing a BSc programme followed by a one-year full-time clinical placement. During this placement, work-based assessments and OSCEs are conducted by the College of Optometrists. The regulatory body overseeing optometry in the UK introduced new standards and outcomes for programmes in 2021. These provide an opportunity to re-think how we assess clinical competence at the University of Manchester. 

We plan to use Entrustable Professional Activities (EPAs) to assure the appropriate level of competence for progressing between years of study and for the terminal decision regarding registration for independent practice. To our knowledge, this would be the first time EPAs have been used in optometry. Our research questions are as follows: 1) What are the EPAs for UK optometric practice? 2) What level of entrustment should be gained for progression and registration? 3) Should a different set of smaller EPAs be used in the early years of study? 4) What type and volume of data will the competence committee need in order to make progression and registration decisions? 

For questions 1 to 3, our plan is to use a small team of academics and clinicians to draft EPAs and suggested entrustment levels according to AMEE Guide No. 140 (Ten Cate and Taylor, 2021), and then to refine these with a wider group using the Delphi method according to AMEE Guide No. 111 (Humphrey-Murto et al., 2017). As of August 2023, we have started drafting the terminal EPAs (n=13). In the next six months we hope to have completed the drafting and to have the Delphi stage under way. We would welcome the opportunity to discuss our overall approach, alternative suggestions for verifying the draft EPAs, how to operationalise them, and methods for addressing research question 4. 




References (maximum three) 

Ten Cate, O., & Taylor, D. R. (2021). The recommended description of an entrustable professional activity: AMEE Guide No. 140. Medical teacher, 43(10), 1106–1114. https://doi.org/10.1080/0142159X.2020.1838465 

Humphrey-Murto, S, Varpio, L., Gonsalves, C. & Wood, T.J. (2017) Using consensus group methods such as Delphi and Nominal Group in medical education research, Medical Teacher, 39:1, 14-19, DOI: 10.1080/0142159X.2017.1245856