Times are shown in GMT.
Assessment in Postgraduate / Post-registration training
Oral Presentation
10:00 am
28 February 2024
M213
Session Program
10:00 am
Annette Burgess1
Harish Tiwari2, Tyler Clark, Alexandra Green2, Jenny-Ann Toribio2, Meg Vost2 and Navneet Dhand2
1 The University of Sydney, Sydney Medical School
2 The University of Sydney
Background
In 2022, the Asia Pacific Consortium of Veterinary Epidemiology (APCOVE) delivered 36 e-learning modules developed to support the training of field veterinarians in the region. Key competencies included outbreak investigation and surveillance, data analysis, risk analysis, One Health, biosecurity and leadership. We explored the effectiveness of the program.
Summary of Work
The program was delivered over six months, with 139 veterinarians from seven countries enrolled. Quantitative and qualitative data were collected via post-module questionnaires, pre- and post-competency questionnaires, and focus groups. Quantitative data were analysed using descriptive statistics, and qualitative data were analysed using thematic analysis. Knowledge acquisition was assessed for each competency.
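For illustration, a minimal sketch of this kind of descriptive pre/post analysis follows; the file name, column names and rating scale are hypothetical stand-ins, not the actual APCOVE questionnaires.

```python
# A minimal sketch of a descriptive pre/post competency analysis.
# The file name, column names and rating scale are hypothetical
# stand-ins, not the actual APCOVE instruments.
import pandas as pd

df = pd.read_csv("competency_questionnaires.csv")  # hypothetical data file
df["change"] = df["post_score"] - df["pre_score"]  # self-rated scores

# Descriptive statistics per competency: n, median change, and IQR.
summary = df.groupby("competency")["change"].agg(
    n="count",
    median="median",
    q1=lambda s: s.quantile(0.25),
    q3=lambda s: s.quantile(0.75),
)
print(summary)
```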
Results
93/139 (67%) trainees completed all competencies. Seventy-four trainees from the Philippines (n=28), Indonesia (n=18), Vietnam (n=16), Cambodia (n=3), PNG (n=3), Laos (n=3) and Timor Leste (n=3) participated in focus groups. Trainees reported marked improvements in their perceived level of knowledge and skills, and in their application in the workplace. They valued the interactivity, knowledge checks, case scenarios and videos. They suggested including local face-to-face sessions to complement online delivery, and the use of local languages. The median assessment task score ranged from 85% to 90%.
Discussion
Our findings demonstrate that the APCOVE e-learning program provided veterinarians with an excellent framework for developing their epidemiology skills. While geographical barriers to participation were mitigated by online delivery, the inclusion of face-to-face sessions, with opportunities for workplace practice, should be considered. This would help to widen participation, increase engagement and build networks.
Conclusion
All modules have been translated into five languages and will be made available internationally, ensuring that quality training resources are available to the animal health workforce. The addition of interactive synchronous sessions to guide trainees, answer their queries and build networks would be beneficial.
Take-home message
The APCOVE Field Veterinary Training program is evidence-based and available to help strengthen field veterinary epidemiology capacity in the Asia-Pacific region.
References (maximum three)
Asia Pacific Consortium of Veterinary Epidemiology (APCOVE). Sydney, Australia; 2021. www.apcove.com.au (accessed 3 May 2022).
Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol. 2006;3(2):77–101. https://doi.org/10.1191/1478088706qp063oa
Jones DS, Dicker RC, Fontaine RE, Boore AL, Omolo JO, Ashgar RJ, Baggett HC. Building global epidemiology and response capacity with field epidemiology training programs. Emerging Infectious Diseases. 2017;23(Suppl 1):S158.
10:15 am
Shelley Ross1
Zoe Brody1, Mawada Tarhuni1, Tarleen Dhanoa1, Todd Milford2, Darren Nichols1 and Shirley Schipper1
1 University of Alberta
2 University of Victoria
Background:
Effective and trustworthy assessment in competency-based medical education requires the development and implementation of well-designed programmatic assessment frameworks. In 2010, we implemented the Competency-Based Achievement System (CBAS), which includes the collection of multiple workplace-based formative assessments. We use learning analytics to continuously improve our learning environment and our approaches to teaching and assessment, including evaluation of CBAS. In this study, we examined two evaluation questions: 1) To what extent does CBAS capture assessment information across all of the competencies of family medicine; and 2) To what extent does CBAS show stability over time (i.e., consistency of use and engagement from teachers and learners)?
Summary of work:
Using ten years of learning analytics data (workplace-based assessment data called FieldNotes), we conducted a retrospective cohort secondary data analysis. Data were analyzed using descriptive statistics and data visualization to identify trends within and across academic years.
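As a rough illustration of this kind of trend analysis, the sketch below tallies FieldNotes per academic year and the within-year distribution across competencies; the file name and column names are assumptions for the example, not the actual CBAS/FieldNotes schema.

```python
# A rough sketch of descriptive trend analysis over FieldNotes data.
# The file name and columns ("academic_year", "competency") are
# assumptions for illustration, not the actual CBAS schema.
import pandas as pd
import matplotlib.pyplot as plt

notes = pd.read_csv("fieldnotes.csv")

# Engagement over time: total FieldNotes per academic year.
per_year = notes.groupby("academic_year").size()

# Within-year distribution of FieldNotes across competencies.
dist = (
    notes.groupby(["academic_year", "competency"]).size()
         .groupby(level="academic_year")
         .transform(lambda s: s / s.sum())
         .unstack("competency")
)
print(dist.round(2))

per_year.plot(kind="bar", title="FieldNotes per academic year")
plt.tight_layout()
plt.show()
```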
Results:
Engagement with CBAS increased from 2010 (5208 FieldNotes) to 2016 (6764), then plateaued until 2020. FieldNotes dropped in 2021 (5525) and 2022 (5207). The distribution of FieldNotes across competencies has been highly consistent across all years: approximately 35% (2000–2500 per year) are about diagnosis and management. The remainder are about patient-centered care (13%), procedural skills (17%), communication (10%), professionalism (6%) and teaching skills (6%). The final two competencies, adaptive expertise (6%) and self-regulated learning (3%), make up the rest.
Discussion:
Our findings indicate that CBAS has been a highly stable assessment framework, with good engagement until very recently. Even though FieldNote entries decreased in 2021 and 2022, workplace-based assessment of learners continues to address a variety of competencies and to draw on multiple observers.
Conclusions:
While CBAS has good stability, attention must be paid to the recent declines in FieldNote entries.
Future research:
We are planning a more in-depth examination of CBAS, using more learning analytics data.
References (maximum three)
1. van der Vleuten CP, Schuwirth LW. Assessing professional competence: from methods to programmes. Med Educ. 2005; 39: 309–17.
2. Ross S, Lawrence K, Bethune C, van der Goes T, Pélissier-Simard L, Donoff M, Crichton T, Laughlin T, Dhillon K, Potter M, Schultz K. Development, implementation, and meta-evaluation of a national approach to programmatic assessment in family medicine residency training. Academic Medicine. 2023;98(2):188–198.
3. Ross S, Poth C, Donoff M, Humphries P, Steiner I, Schipper S, Janke F, Nichols D. The Competency-Based Achievement System (CBAS): Using formative feedback to teach and assess competencies with Family Medicine residents. Canadian Family Physician. 2011;57:e323-e330.
10:30 am
David Kok1,2,3
Kristie Matthews2, Caroline Wright2 and Steve Trumble3
1 Peter MacCallum Cancer Centre
2 Monash University
3 University of Melbourne
Background
Assessment tasks in Health Professional Education (HPE) require regular renewal to stay contemporary in a constantly evolving educational landscape. In particular, the last five years have seen notable advances in pedagogical theory and educational technology, alongside the influence of COVID-19. Our aim was to explore how Australian HPE educators modified tertiary assessments in response to these factors.
Summary of work
We reviewed all Australian postgraduate HPE teaching qualifications from 2018 onwards (these were considered exemplar courses that would influence downstream HPE delivery). Assessments for each subject were compared between 2018 and 2023; differences were quantified, pedagogical drivers for change identified, and trends in practice analysed.
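One plausible way to operationalise the change quantification is sketched below; the subject codes and assessment descriptors are invented for illustration, and the symmetric-difference metric is an assumed reading of the ">20% change" threshold, not necessarily the authors' coding scheme.

```python
# A hypothetical sketch of quantifying assessment change between the
# two audit years. Subject codes and assessment descriptors are
# invented; the symmetric-difference metric is one plausible reading
# of the ">20% change" threshold, not the authors' actual scheme.
from typing import Dict, Set

def change_fraction(a2018: Set[str], a2023: Set[str]) -> float:
    """Fraction of assessments differing between the two years
    (symmetric difference over the union)."""
    union = a2018 | a2023
    return len(a2018 ^ a2023) / len(union) if union else 0.0

subjects: Dict[str, Dict[int, Set[str]]] = {
    "HPE5001": {2018: {"essay", "written exam"},
                2023: {"essay", "reflective portfolio"}},
    "HPE5002": {2018: {"group project"}, 2023: {"group project"}},
}

for code, years in subjects.items():
    frac = change_fraction(years[2018], years[2023])
    label = "major" if frac > 0.20 else ("modified" if frac > 0 else "unchanged")
    print(f"{code}: {frac:.0%} change ({label})")
```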
Results
We identified 11 relevant HPE teaching qualifications. Assessment outlines were sourced for 77 subjects, with a combined total of 421 assessments (204 from 2018 and 217 from 2023). In this period, 2 subjects were discontinued and 9 introduced. In 2018, 55.9% of subjects were delivered online, compared with 78.7% in 2023.
Sixty-six subjects had details available for both 2018 and 2023. Of these, 48.5% (32/66) had their assessments modified, and 30.3% (20/66) were modified in a major fashion (>20% change in assessments). Fourteen subjects had a change relating to online assessment, 6 demonstrated a clear change in pedagogical assessment philosophy (e.g., introducing reflective assessment tasks; one subject removed a disproved learning theory), and 3 underwent complete renewal.
Discussion and Conclusion
Significant changes in assessment practice occurred in Australian HPE teaching courses between 2018 and 2023, with approximately half of all subjects undergoing assessment changes. Key drivers included adaptation to online assessment and evolving assessment philosophies.
Take Home Messages
The HPE teaching community implemented widespread assessment changes in line with evolving delivery contexts and pedagogical philosophies.
Further studies to demonstrate resulting downstream effects would be beneficial.
References (maximum three)
Cox, M., & Irby, D. M. (2007). Assessment in medical education. The New England Journal of Medicine, 356(4), 387–396.
Eraky, M. A. (2021). Evolving trends for assessment in the new norm of medical education. South‐East Asian Journal of Medical Education, 15, 3.
Harris, P., Bhanji, F., Topps, M., Ross, S., Lieberman, S. A., Frank, J. R., Snell, L., & Sherbino, J. (2017). Evolving concepts of assessment in a competency-based world. Medical Teacher, 39(6), 603–608.