Progress testing
Oral Presentation
4:00 pm
27 February 2024
M204
Session Program
4:00 pm
Gergo Pinter1
Daniel Zahra2, Steven Burr1, Thomas Gale2, Jolanta Kisielewska1, José Miguel Gomes Moreira Pêgo3 and Nuno Santos4
1 University of Plymouth
2 Peninsula Medical School
3 University of Minho
4 iCognitus4ALL - IT Solutions, Braga
Peninsula Medical School (PMS) is pioneering the introduction of Content Adaptive Progress Testing (CAPT) [1,2] for new students in 2023. Unlike traditional Computerised Adaptive Testing (CAT) [3] assessments, which adapt during the exam by adjusting difficulty levels, CAPT takes a new approach by adapting the content of questions between assessments, tailoring each test to individual students based on their past performance. The CAPT approach was specifically designed to improve longitudinal progress testing for students and educators while meeting national licensing exam requirements.
Each CAPT assessment consists of 125 personalised non-negatively marked multiple-choice questions. Questions are mapped to one of over 600 topics aligned with the UK General Medical Council (GMC) framework (gmc-uk.org/mla), promoting a comprehensive understanding of medical knowledge. Question selection rules prioritise fairness, balance, and attainability while quantifying student performance based on the number of unique correctly answered topics, eliminating norm-referencing.
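As an illustration only, the selection-and-scoring logic can be sketched as below; the data structures, function names, and the simple "unseen topics first" fill rule are our assumptions, not PMS's published algorithm (which also weighs fairness, balance, and attainability):

```python
import random
from dataclasses import dataclass


@dataclass(frozen=True)
class Question:
    qid: str
    topic: str  # one of the 600+ GMC-aligned topics


def select_capt_paper(bank, topics_correct, paper_size=125, seed=None):
    """Assemble a personalised paper: prioritise topics the student
    has not yet answered correctly, then top up with the rest."""
    rng = random.Random(seed)
    unseen = [q for q in bank if q.topic not in topics_correct]
    seen = [q for q in bank if q.topic in topics_correct]
    rng.shuffle(unseen)
    rng.shuffle(seen)
    return (unseen + seen)[:paper_size]


def update_unique_topic_score(topics_correct, answered_correctly):
    """Performance is the cumulative set of unique topics answered
    correctly (non-negative marking, no norm-referencing)."""
    return topics_correct | {q.topic for q in answered_correctly}
```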
We present our experience of developing and delivering cohesive, student-centric CAPT assessments which build on the benefits of traditional identical-question tests, such as bidirectional navigation and a frequent but low-stakes nature, while eliminating compensation within and across assessments. CAPT's asynchronous delivery capability also offers a distinct advantage over non-adaptive methods as student numbers rise. However, implementing fair question setting and progression decisions required new, more complex processes.
Truly personalised assessments benefit students by providing highly detailed progression information, feedback, and remediation options to guide their learning. Educators also benefit from greater teaching insights and reduced question-setting workloads. Finally, the general public stands to gain from the CAPT method, since it promotes the training of safe, generalist foundation doctors.
The personalised CAPT approach offers substantial benefits for students and educators alike, along with closer alignment to learning outcomes, and is a compelling candidate for longitudinal assessment.
References
[1] Burr SA, Gale T, Kisielewska J, Millin P, Pêgo JM, Pinter G, et al. A narrative review of adaptive testing and its application to medical education [Internet]. Vol. 13, MedEdPublish. F1000 Research Ltd; 2023. p. 221. Available from: http://dx.doi.org/10.12688/mep.19844.1
[2] Burr SA, Kisielewska J, Zahra D, Hodgins I, Robinson I, Millin P, et al. Personalising knowledge assessments to remove compensation and thereby improve preparation for safe practice - developing content adaptive progress testing [Internet]. Research Square Platform LLC; 2022. Available from: http://dx.doi.org/10.21203/rs.3.rs-1977511/v1
[3] Rice N, Pêgo JM, Collares CF, Kisielewska J, Gale T. The development and implementation of a computer adaptive progress test across European countries [Internet]. Vol. 3, Computers and Education: Artificial Intelligence. Elsevier BV; 2022. p. 100083. Available from: http://dx.doi.org/10.1016/j.caeai.2022.100083
4:15 pm
Richard Arnett1,2
Muríosa Prendergast1
1 RCSI University of Medicine and Health Sciences
2 Surgical Royal Colleges of the United Kingdom and in Ireland
1. Background
RCSI is in the process of implementing its ‘Transforming Healthcare Education Project (THEP)’ which has focussed on the creation of a new curriculum for undergraduate medicine. One of the new assessment tools being used is a form of Progress Testing.
2. Summary of work
The RCSI version of Progress Testing involves four diets per year, each consisting of 160 MCQs with three options plus a 'Don't Know' option. In each diet, 60 items focus on content from the current stage of study and 100 items focus on graduation-level content. Penalty scoring is applied: +1 (correct), -0.5 (incorrect), and 0 (Don't Know). By February 2024, these students will be just over halfway through their second year of Progress Testing.
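A minimal sketch of this marking scheme, with a worked example (the names and counts are ours for illustration, not RCSI's implementation):

```python
from enum import Enum


class Response(Enum):
    CORRECT = "correct"
    INCORRECT = "incorrect"
    DONT_KNOW = "dont_know"


# Penalty-scoring weights as described in the abstract.
MARKS = {Response.CORRECT: 1.0,
         Response.INCORRECT: -0.5,
         Response.DONT_KNOW: 0.0}


def score_diet(responses):
    """Total mark for one diet under penalty scoring."""
    return sum(MARKS[r] for r in responses)


# Worked example for a 160-item diet:
# 90 correct, 40 incorrect, 30 "Don't Know".
example = ([Response.CORRECT] * 90 + [Response.INCORRECT] * 40
           + [Response.DONT_KNOW] * 30)
print(score_diet(example))  # 90 - 20 + 0 = 70.0
```

Note how the 'Don't Know' option lets a student avoid the -0.5 penalty rather than guess, which is the behavioural point of penalty scoring.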
3. Results
Predictably, initial feedback from students (and staff) on this new form of assessment has been mixed. Results from the first few diets showed very little progress, with many students choosing to ignore the more advanced content. Additional communication was provided towards the end of the first year, resulting in a small but noticeable increase in performance as students became more familiar with the format.
4. Discussion
This is a new format for most students, and it will take time to bed down. The main benefits include more frequent, better-quality feedback and real-time estimates of students' progress toward their ultimate goal (final-year knowledge competence).
5. Conclusions
Despite a great deal of preparation, additional support and communication were necessary to ensure students fully understood this new assessment format and how best to approach it.
6. Take-home messages/implications for further research or practice
Communication is key. Students and staff need to be prepared for this new type of assessment, and this communication needs to be maintained throughout its implementation.
References
Schuwirth LWT, Vleuten CPM van der. The use of progress testing. Perspect Med Educ. 2012;1(1):24–30
McHarg J, Bradley P, Chamberlain S, Ricketts C, Searle J, McLachlan JC. Assessment of progress tests. Medical Education. 2005;39(2):221–7
Vleuten C van der, Freeman A, Collares C. Progress test utopia. Perspectives on Medical Education. 2018;7(2):136–8
4:30 pm
Carlos Gomez-Garibello1
Maryam Wagner2 and Paola Fata2
1 Institute of Health Sciences Education - McGill University
2 McGill University
Background:
The Canadian Association of General Surgeons (CAGS) exam is a mandatory surgical knowledge assessment for general surgery residents in Canada. This formative exam assesses the core knowledge required by trainees across different domains of surgery. The purposes of this study were to evaluate: i) the differences in surgical knowledge of residents across training years; and ii) longitudinal trends and development of knowledge over time.
Summary of the Work
Residents’ performance data were analyzed both cross-sectionally and longitudinally, and across different surgical domains. Participants’ perceptions of the exam were collected using a post-exam survey. Exam results are used to foster residents’ learning through: 1) provision of a ‘report card’ detailing each learner’s individual exam performance, with program directors receiving aggregated, anonymized data; and 2) delivery of a series of online interactive sessions in which test-takers and expert surgeons review test items.
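A minimal sketch of the two reporting views, assuming a hypothetical per-resident, per-domain results table (the column names, domains, and numbers are illustrative, not the CAGS data schema):

```python
import pandas as pd

# Hypothetical results table; not the CAGS data schema.
results = pd.DataFrame({
    "resident_id": ["r1", "r1", "r2", "r2"],
    "domain": ["trauma", "oncology", "trauma", "oncology"],
    "percent_correct": [72.0, 65.0, 58.0, 80.0],
})

# 1) Individual 'report card': one resident's domain-level performance.
report_card = results[results["resident_id"] == "r1"].set_index("domain")

# 2) Program-director view: aggregated and anonymized by domain.
aggregate = results.groupby("domain")["percent_correct"].agg(["mean", "count"])

print(report_card)
print(aggregate)
```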
Results
The analyses revealed that the test successfully measures the progression of surgical knowledge and differentiates performance between junior and senior residents. The longitudinal analyses showed that the overall performance of learners who took the exam during the last five years improved progressively. In addition, the test provided domain-specific information about residents’ strengths and areas for improvement. Finally, residents perceived the exam as a positive resource for preparing for their certification examination.
Discussion and Conclusion
These findings suggest that the CAGS exam has great potential to advance residency education by allowing residents to monitor their knowledge development and by providing program directors with information about their residents’ progress across specific domains, so they can adjust their curricula accordingly.
Take home messages:
The CAGS exam exemplifies the ways in which learners can benefit from a formative exam. Additionally, this exam may serve as a model to integrate nation-wide formative examinations into program curricula.
References
Wrigley W, Van Der Vleuten CP, Freeman A, Muijtjens A. A systemic framework for the progress test: strengths, constraints and issues: AMEE Guide No. 71. Medical teacher. 2012 Sep 1;34(9):683-97.
Fulcher G, Davidson F. Language testing and assessment. London and New York: Routledge; 2007 Jan.
4:45 pm
Andy Wearn1
Sanjeev Krishna, Michael Chieng, Shomel Gauznabi, George Shand and Nathan Ryckman1
1 University Of Auckland
Background:
Medical students are expected to learn and gain competencies in applied clinical knowledge. Progress testing (PT) is intended to assist in knowledge acquisition and promote ongoing recall and review. This study explored student preparation for PT, relationships between approach and performance, and patterns of support.
Summary:
A cross-sectional survey exploring study approach and individual student context was designed. Students self-reported their aggregate grade for each completed year. The survey was sent to all clinical students at two sites (n=297). A positivist approach was taken for quantitative data, and a constructivist view for qualitative data.
Results:
129 students responded (43.4%). Most had stable performance over time, whilst 22 students had improving or deteriorating aggregate grades. Most students (87.0%) reported doing regular weekly background study. Almost all students accessed basic feedback on their PT performance (94.1%), but few accessed the learning points and linked resources (12.7%). Poor early PT performance was associated with having an improvement strategy (χ² = 6.954, p = 0.008). Students who never fell below satisfactory were less likely to have an improvement strategy (χ² = 10.084, p = 0.001). A third of students reported events that impacted their study, and accessing pastoral care was associated with poorer performance (χ² = 4.701, p = 0.030).
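For context, associations of this kind are typically tested with a chi-squared test on a 2×2 contingency table; a minimal sketch with invented counts (not the study data) might look like this:

```python
from scipy.stats import chi2_contingency

# Illustrative 2x2 table: rows = poor early PT performance (yes/no),
# columns = has an improvement strategy (yes/no). Counts are invented.
table = [[18, 6],    # poor early performance
         [45, 60]]   # satisfactory early performance

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.3f}, p = {p:.3f}, dof = {dof}")
```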
Discussion & Conclusions:
Student approaches to PT preparation are diverse and have variable efficacy. Students who performed poorly early in the programme and received targeted support improved their performance over time. Using question banks alone was insufficient to improve results without additional approaches. External impacts on performance were common. The feedback provided on students' individualised dashboards was underutilised, suggesting a need to improve feedback literacy and ensure that the information provided is fit for purpose.
Take-home:
1. Targeted support helped students to develop strategies to improve
2. Non-academic support also plays a part in performance
3. Best intentions for individualised feedback may not meet student needs