Development of remote online assessment
Oral Presentation
2:00 pm
27 February 2024
M211
Session Program
2:00 pm
James Fraser1
1 Australian College of Rural and Remote Medicine
The Australian College of Rural and Remote Medicine (ACRRM) was established in 1999 and is one of two colleges that provide General Practice training in Australia. ACRRM Fellows complete a four-year rural generalist training program that equips them with the knowledge, primary care and advanced skills to provide patient care in rural and remote Australia. Graduates are Rural Generalists, formally recognised as specialist practitioners who provide a combination of extended general practice and additional expertise in a sub-specialty that rural communities need but cannot support in a full-time sub-specialist roster.
The ACRRM Fellowship assessment program has a programmatic design and is wholly delivered remotely. The program uses a variety of modalities, including Multiple Choice Questions, Multi Source Feedback, mini-CEX, Case Based Discussions and a procedural logbook. ACRRM pioneered the development of a remotely delivered, multi-station clinical assessment in 2008 and has run it once or twice yearly since (Sen Gupta et al., 2021). This assessment method, known as the Structured Assessment of Multiple Patient Scenarios (StAMPS), consists of 8 ‘stations’ in which candidates explore standardised clinical scenarios and respond to a number of questions from a sole examiner, with limited prompting. The scenarios are set in a standardised hypothetical location, “Stampsville”, which has a defined community profile. The content of the assessments is developed by practicing rural generalists and focuses on assessing clinical practice in authentic rural environments.
The remote delivery of all assessments removes the need for trainees and examiners to travel to large regional towns or major cities to sit assessments, avoiding their removal from the medical workforce of their home towns. This presentation will discuss the ACRRM assessment program, with a focus on the processes supporting remote delivery.
References (maximum three)
Smith JD and Hays RB. Is rural medical practice a separate discipline? Australian Journal of Rural Health, 2004; 12: 67-72.
Sen Gupta T, Campbell D, Chater AB et al. Fellowship of the Australian College of Rural & Remote Medicine (FACRRM) Assessment: a review of the first 12 years [version 1]. MedEdPublish 2020, 9:100. https://doi.org/10.15694/mep.2020.000100.1
Sen Gupta T, Wong E, Doshi D, Hays RB. ‘Stability’ of Assessment: Extending the Utility Equation. MedEdPublish 2021, 10.
2:15 pm
Chris Plummer1,2
Danny Mathysen3, Stephanie Thibault1 and Clive Lawson4
1 European Society of Cardiology
2 Union of European Medical Specialists Cardiology Section
3 University of Antwerp
4 Maidstone and Tunbridge Wells NHS Trust
Background
The EECC[1] is a summative test of core cardiology knowledge[2], comprising 120 best-of-five multiple choice questions with a Hofstee-derived pass mark[3]. From 2012 to 2019, the EECC was delivered in test centres. The pandemic made this impossible in 2020, so exam preparation meetings moved on-line, and the exam was delivered on-line with remote proctoring.
Summary of work
We analysed data from the delivery and proctoring suppliers, candidate satisfaction questionnaires, costs and meeting records for the 3 years before and the 4 years after the transition.
Results
A total of 5267 cardiology trainees have sat the EECC: 2489 in-person and 2778 on-line since 2020. Comparing in-person (2019) and on-line delivery (2023), the number of participating national societies has more than doubled (16 to 44), and candidate numbers have increased by 51% (575 to 868). Preparation costs have fallen by over €100 per candidate (a total of >€86,800 in 2023) because of reduced travel and accommodation, and delivery costs are €21 lower per candidate (€18,228 in 2023), with further savings for candidates from reduced travel. The proportion of candidates not attending the exam has fallen from 7.3% to 2.2%, and the proportion with technical difficulties has fallen from 12% to 2.2%. There has been no significant change in mean pass mark (66.7 vs 66.8) or pass rate (80.0% vs 82.8%). There was no change in candidates’ satisfaction with the examination software, and rigorous monitoring has not identified any attempts at misconduct.
Discussion
The transition to on-line preparation and delivery of the EECC was necessitated by the COVID-19 pandemic, but we are pleased to find that it has widened participation, improved quality and reduced costs.
Take-home messages
High-stakes exams can safely be prepared and delivered on-line. However, face-to-face meetings at least once each year are highly desirable to build and maintain long-term teams.
References (maximum three)
[1] European Exam in Core Cardiology (EECC)
Available at: https://www.escardio.org/Education/Career-Development/European-Exam-in-Core-Cardiology-(EECC)
Accessed: 28 June 2023.
[2] Felix Tanner, Nicolas Brooks, Kevin Fox, Lino Gonçalves, Peter Kearney, Lampros Michalis, Agnès Pasquet, Susanna Price, Eric Bonnefoy, Mark Westwood, Chris Plummer, Paulus Kirchhof.
ESC Core Curriculum for the Cardiologist
European Heart Journal 2020;41:3605-3692. https://doi.org/10.1093/eurheartj/ehaa641
[3] Chris Plummer, Sarah Bowater, Jim Hall, Clive Lawson, Georgina Ooues, Susanna Price, Russell Smith, Ian Wilson, Rob Wright.
Behind the scenes of the European Examination in General Cardiology
Heart 2019;105:889-890.
http://dx.doi.org/10.1136/heartjnl-2018-314495
Gerrard Phillips1
Chris McManus2 and Liliana Chis3
1 Executive Medical Director, The Federation of Royal Colleges of Physicians of the United Kingdom
2 Emeritus Professor, Research Department, University College London
3 MRCP(UK)
Background:
The MRCP(UK) PACES examination assesses the clinical knowledge, behaviours and skills of trainee doctors who aim to enter physician higher specialist training. Seven skills are assessed in five stations (1-5): 1: Respiratory and Abdominal examinations; 2: History taking; 3: Cardiovascular and Neurological examinations; 4: Communication skills and ethics; 5: Two brief clinical consultations. Pre-COVID, assessments were delivered in-person in a hospital ward or postgraduate education centre, using patients and surrogates. During the pandemic, the format of stations 2&4 was changed to remote assessment to introduce social distancing. Candidates were examined in a separate same-day mini-carousel via a video link to the surrogates and examiners. The other stations remained unchanged.
Summary of work:
The main question was whether totalled scores for Skill C (Clinical Communication) and Skill F (Managing Patients’ Concerns) differed between stations administered remotely during COVID and in-person stations from before COVID. Participants were UK trainees sitting PACES for the first time: 5,274 in 2017-2019 (pre-COVID) and 2,074 during the pandemic.
Results:
Overall scores on stations 2&4 and station 5 were slightly but non-significantly lower during COVID, although, as is usual in PACES, scores on station 5 were higher than those on stations 2&4. The key analysis used ANOVA to show that there was no significant station-by-COVID interaction (p=0.852), indicating that testing stations 2&4 remotely during COVID, compared with station 5 which remained in-person, did not alter candidates’ relative scores.
Discussion:
COVID did not alter the pattern of scores on communication skills when stations 2&4 were tested remotely as compared with being tested in-person as a part of the normal carousel.
Conclusion:
Remote assessment of clinical skills in communication stations had little impact on UK trainees’ performance.
Take-home messages:
Remote assessment of clinical skills is practical under special circumstances, but it should be fit for purpose, ensuring that high standards are maintained.
References (maximum three)
1. Lara S, Foster CW, Hawks M, Montgomery M. Remote Assessment of Clinical Skills During COVID-19: A Virtual, High-Stakes, Summative Pediatric Objective Structured Clinical Examination. Acad Pediatr. 2020 Aug;20(6):760-761. doi: 10.1016/j.acap.2020.05.029. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7273144/
2:45 pm
Sink or Swim: developing an agile, competency-based assessment in response to the pandemic.
Gary Butler - National Clinical Lead for Assessment - Royal Australian College of General Practitioners, Adareeka Jayasinghe - Medical Educator (Assessment Development) - Royal Australian College of General Practitioners, Rebecca Lock - National Assessment Advisor - CCE and National Lead Medical Educator - ADF - RACGP
Gary Butler1
Rebecca Lock1 and Adareeka Jayasinghe1
1 RACGP
The pandemic required adaptive thinking to ensure that the pipeline of Fellowed GPs, fit for unsupervised practice, into the Australian general practice workforce was not disrupted. The Royal Australian College of General Practitioners (RACGP) pivoted from a face-to-face OSCE to a remote, online and live assessment. The RACGP was one of the only Australian specialty colleges to continue with its full assessment portfolio throughout the whole pandemic.
The College had been reviewing the clinical exams with the intention of progressing from a checklist OSCE to a competency-based assessment; the pandemic catalysed this process.
The Clinical Competency Exam (CCE) is a multi-station examination that samples candidate competency across the RACGP Curriculum, focusing on the application of clinical management skills in the context of Australian general practice. It is the final hurdle assessment before candidates obtain Fellowship.
The CCE represents an agile, contemporary, evidence-based examination. Extensive consultation with leading international medical assessment experts informed its development, ensuring that the assessment is valid and reliable and enabling the RACGP to assess candidates’ clinical competence in multiple contexts, upholding the standard of Fellowship. The delivery model means the assessment can flex to national and local adverse events, not just the pandemic.
The CCE was developed in 2019 following a decision to review and update the longstanding Objective Structured Clinical Examination (OSCE). The COVID-19 pandemic accelerated the development of this exam with a transitional Remote Clinical Exam (RCE). The successful delivery of the RCE demonstrated that the ‘shows how’ level of Miller’s pyramid could be assessed virtually without compromising the attributes and quality of a face-to-face exam.
This presentation will showcase the theory, development and implementation of one of the largest successful online competency assessments worldwide, developed and delivered in pandemic conditions and now set to become the benchmark final clinical assessment for RACGP Fellowship.
The CCE has been observed internationally, and our team continues to work with international partners to help our peers create contemporary assessments in primary care worldwide.
This symposium is designed to examine the process of shifting from an established clinical assessment to a very different model and mode of delivery, and to reflect on and share the lessons learned and the ongoing evolution of the assessment in response to quality assurance and evaluation. In doing so, it aims to encourage those on a similar journey and motivate those looking to change established assessments.
References (maximum three)
William F. Iobst, Jonathan Sherbino, Olle Ten Cate, Denyse L. Richardson, Deepak Dath, Susan R. Swing, Peter Harris, Rani Mungroo, Eric S. Holmboe, Jason R. Frank & for the International CBME Collaborators (2010) Competency-based medical education in postgraduate medical education, Medical Teacher, 32:8, 651-656, DOI:10.3109/0142159X.2010.500709
Margery H. Davis & Ronald M. Harden (2003) Editorial Competency based assessment: making it a reality, Medical Teacher, 25:6, 565-568, DOI:10.1080/0142159032000153842
Shumway JM, Harden RM; Association for Medical Education in Europe. AMEE Guide No. 25: The assessment of learning outcomes for the competent and reflective physician. Med Teach. 2003 Nov;25(6):569-84. doi: 10.1080/0142159032000151907. PMID: 15369904.
3:15 pm
A Curtis Lee1
Renate Fellinger1, Imogene Rothnie2 and Libby Newton1
1 Royal Australasian College of Physicians
2 ANZAHPE, AES
Background
COVID-19 impacts and responses varied within Australian and Aotearoa New Zealand jurisdictions. Following the principles of the utility index of assessment, the Royal Australasian College of Physicians (RACP) adapted the delivery of the clinical examinations to incorporate improvements whilst maintaining the intent and robustness of assessment processes.
Summary of work
Given the need to adapt the assessment system to address COVID-19 issues, the RACP reviewed and revised its clinical examination to meet the requirements of the pandemic while keeping the purpose of the assessment in mind. Changes addressed travel restrictions, social distancing, hospital access, wellness/mental health challenges, restrictions on movement and assembly, and impacts on candidate training. Strategies were developed to maintain the ability to assess whether candidates met the standards of Fellowship. Modifications were selectively implemented to accommodate requirements yet maintain test integrity. These included modularising the exam, offering multiple sittings and hybridising delivery methods, bolstered by digital marking innovations.
Results
Psychometric and quality assurance reviews showed consistent results. Feedback from candidates and examiners was mostly positive regarding the acceptability of the changes. Feedback also confirmed regional variations in COVID-19 impacts on learning opportunities and examination preparation, corroborating the RACP’s approach to adjusting examination delivery.
Conclusion
The diversity of adversity experienced throughout the COVID-19 pandemic, and the success of our adaptations to examination delivery, underscore the importance of robust yet flexible assessment. The RACP translated these learnings into the adoption of technology, emphasising that assessment needs to be responsive not only to context in general but specifically to challenging external factors.
Take-home messages/implications for practice
Assessment frameworks and policies should be sufficiently flexible to mitigate future disruptions while maintaining assessment validity and fairness.
Thoughtful improvements made during COVID have had a lasting impact on assessment, including the introduction of technology and greater consideration of communication with, and the impact on, candidates during change.
References (maximum three)
None