Times are shown in GMT
Virtual tools for assessment and learning
Oral Presentation
2:00 pm
27 February 2024
M212
Session Program
2:00 pm
Malka Stromer, MEd, BSc, RDMS, CRGS1 and Maren Nelson1
1 WCUI | Global
Point-of-Care Ultrasound (POCUS) education requires assessment of competency through skills demonstration, including image review by trained professionals.
There is a massive educational discrepancy amongst ultrasound users worldwide. In North America and many other countries, including Australia and New Zealand, a sonographer or ultrasound technologist is a healthcare professional tasked with acquiring diagnostic ultrasound images that are sent to radiologists for diagnostic interpretation.
This division has historically limited the use of POCUS by other healthcare practitioners. Presently, many other specialties are seeing the value of diagnostic ultrasound, and since ultrasound technology is now easily accessible and inexpensive, individuals can purchase equipment, acquire images, and make diagnostic interpretations of these images with no educational requirements at all!
Many medical schools have incorporated aspects of POCUS into their curricula, but progress is limited by a lack of trained faculty and supporting resources and by the difficulty of making curricular changes.
We built an interdisciplinary team with expertise in online learning design and ultrasound education, which developed an innovative curriculum, grounded in the learning theory of Communities of Practice, to create scalable apprenticeship models and sustainable mentorship opportunities for the on-going assessment of scanning skills at various learning levels.
Our education model features stackable, scaffolded, fully online, asynchronous coursework that culminates in live scanning in clinic, in person, or through virtual collaboration platforms such as Zoom. As the final step, our summative assessment is conducted using our proprietary web application, POCUS PRO, which allows learners to upload scans of various POCUS protocols for expert review and feedback. This level of assessment ensures hands-on skills are demonstrable and encourages ongoing scan review as the learner solidifies their scanning skills by incorporating POCUS into their clinical practice.
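To make the upload-and-review idea concrete, here is a minimal illustrative sketch in Python. It is not the POCUS PRO application or its API; the class, field names, and review function are invented purely to illustrate learners submitting scans that experts later review asynchronously.

```python
# Minimal illustrative sketch of an upload-and-review workflow of this kind.
# This is NOT the POCUS PRO application or its API; all names and fields
# are invented for illustration only.
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class ScanSubmission:
    learner_id: str
    protocol: str                              # e.g. "FAST", "lung", "cardiac"
    clip_uri: str                              # location of the uploaded cine clip
    submitted_at: datetime = field(default_factory=datetime.now)
    reviewer_feedback: Optional[str] = None
    passed: Optional[bool] = None

def review(submission: ScanSubmission, feedback: str, passed: bool) -> ScanSubmission:
    """An expert reviewer attaches feedback and a pass/revise decision."""
    submission.reviewer_feedback = feedback
    submission.passed = passed
    return submission

# A learner uploads a FAST protocol clip; an expert reviews it asynchronously.
sub = ScanSubmission("learner-042", "FAST", "uploads/learner-042/fast.mp4")
review(sub, "RUQ view adequate; increase depth on the pelvic view.", passed=False)
print(sub.passed, "-", sub.reviewer_feedback)
```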
Currently there are no certification requirements, nor educational requirements for certification, for POCUS use in medical settings. Our educational model positively impacts patient care by creating a pathway to competency that ensures skills acquisition is uniformly demonstrated by learners. We are actively advocating for requirements that include this type of education and on-going scan review to demonstrate skills competency prior to both implementation in a clinical environment and certification.
Technology allows us to augment curricula with limited changes in overall program structure. By incorporating this technology as a resource, we can create sustained models for learning and work towards competent use in the clinical setting through on-going mentorship and feedback extending into residency/internship.
Our presentation will include data from a recent pilot project, implementing the first phases of our curriculum model at a large U.S.-based medical school. Through this project, we assess the impact of online learning resources on first and second-year medical students prior to their Clinical Skills Labs which instruct on POCUS use. We anticipate showing that our technological innovations enhance student confidence in their ability to participate in labs, perform POCUS scans and utilize these skills throughout the remainder of their medical school journey. Additionally, our presentation will include interactive components to bring attendees through a recreation of the learner journey!
References (maximum three)
1. Building a bigger tent in point-of-care ultrasound education: a mixed-methods evaluation of interprofessional, near-peer teaching of internal medicine residents by sonography students (2018). https://bmcmededuc.biomedcentral.com/articles/10.1186/s12909-018-1437-2
2. Development and implementation of a point of care ultrasound curriculum at a multi-site institution. https://theultrasoundjournal.springeropen.com/articles/10.1186/s13089-021-00214-w
3. Competency Assessment of POCUS in Emergency Medicine. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9028775/
2:15 pm
Vikki O'Neill1
James Scholar1, Kathy Cullen1 and S. Helen Oram2
1 Queen's University Belfast
2 King's College London
Background
An advantage of computer-delivered assessment is the ability to gather real-time information about candidate interaction with individual questions and with assessments as a whole (“clickstream” data) [1]. Item selection, question-order navigation, view time, bookmarking and repeated viewing may all be interrogated.
Understanding student exam strategies, also known as “testwiseness”, is vital: whilst exams aim to quantify knowledge, factors such as test anxiety, motivation, and strategic approach also impact test outcomes. Testwiseness not only affects scores, but also plays a secondary role in reducing anxiety and sustaining motivation [2,3].
Summary of work
Clickstream data from an online invigilated Single Best Answer (SBA) Progress Test, given to medical students in their first three years (870 students, 100 items), were analysed. Employing classification techniques in R, six groups were formed based on students' structural strategies, such as question skipping and reviewing.
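For illustration only: the analysis itself was performed with classification techniques in R, but a hypothetical Python sketch of this kind of grouping, using invented per-student navigation features and synthetic data rather than the study's clickstream logs, might look like the following.

```python
# Hypothetical sketch: grouping students by exam-navigation features derived
# from clickstream logs. The study used classification in R; the feature
# names, synthetic data, and clustering choice here are illustrative only.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# One row per student: counts of skipped items, items revisited, and full
# passes through the 100-item paper (synthetic values for 870 students).
features = np.column_stack([
    rng.poisson(5, 870),     # questions skipped on first pass
    rng.poisson(40, 870),    # questions revisited at least once
    rng.integers(1, 4, 870)  # complete passes through the paper
])

# Standardise, then partition into six strategy groups, as in the abstract.
scaled = StandardScaler().fit_transform(features)
labels = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(scaled)

# Proportion of students in each strategy group.
for group, count in zip(*np.unique(labels, return_counts=True)):
    print(f"strategy {group}: {count / len(labels):.1%} of students")
```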
Results
Results indicate a similar distribution of strategies across the year groups, with the most popular strategy being to review all questions once (39.85% Y1, 41.97% Y2, 42.37% Y3). Reviewing all 100 questions in a second pass improved scores by 0.31% (p=0.0516). Students’ scores were found to differ significantly across the strategies for year 3 only (p=0.0014). A small proportion of students showed a truly non-sequential approach, apparently selecting questions at random, which bears further study.
Discussion
Distinct reviewing strategies were noted, with no variation in proportions by year group. Reviewing questions at least once each was of some benefit to students. These strategies provide insights into how we might best support borderline students and those entitled to additional exam time.
Conclusions/Take-home messages
Clickstream data offers an opportunity to better understand students’ testwiseness. Exam strategies benefit students, and consequently we should consider teaching students how to utilise these to their advantage.
References (maximum three)
1. McManus, I.C., Chis, L., Ferro, A., Oram, S.H., Galloway, J., O'Neill, V., Myers, G. and Sturrock, A. (2023). “Visualising candidate behaviour in computer-based testing: Using ClickMaps for exploring ClickStreams in undergraduate and postgraduate medical examinations”. medRxiv.
2. Mahamed, A., Gregory, P. A., & Austin, Z. (2006). "Testwiseness" among international pharmacy graduates and Canadian senior pharmacy students. American journal of pharmaceutical education, 70(6), 131. https://doi.org/10.5688/aj7006131
3. Russo, James A. (2019) "The impact of a short test-wiseness intervention on standardised numeracy assessment scores: A cautionary tale about using NAPLAN growth data to evaluate primary schools," Networks: An Online Journal for Teacher Research: Vol. 21: Iss. 2. https://doi.org/10.4148/2470-6353.1301
2:30 pm
Briseida Mema1
Dominique Piquette
1 Critical Care Medicine Department, Hospital for Sick Children
Background:
Disentangling formative and summative assessments in CBME remains an important issue. Simulation may help as a low-stakes learning environment, affording practice and experimentation, but it is not clear how assessment in simulation is perceived by trainees.
Summary:
The goal of this study was to explore trainees' perceptions of their virtual reality (VR) bronchoscopy simulation assessments in an outcome-based model of training. Specifically, we aimed to examine which assessments learners select to document and receive feedback on, and what influences their decision.
We used a sequential explanatory mixed methods strategy. During independent simulation practice, we collected the number of attempts that were learning-focused practice (scores not recorded) and assessment-focused practice (scores recorded and reviewed by the instructor for the purpose of feedback), and the time each attempt lasted. At the end of simulation training, we conducted interviews to explore learners' perceptions of assessment.
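As a minimal sketch of the quantitative comparison this design supports (synthetic attempt durations and group sizes, not the study's records; the test choice is an assumption for illustration), attempt durations for the two practice types could be compared as follows.

```python
# Minimal sketch of comparing attempt durations between the two practice
# types, using synthetic logs; values and group sizes are illustrative only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Duration (minutes) of each simulation attempt, by practice type.
learning_focused   = rng.normal(16, 1, size=60)   # scores not recorded
assessment_focused = rng.normal(6, 3, size=60)    # scores recorded for feedback

# Welch's t-test on attempt duration between the two practice types.
t, p = stats.ttest_ind(learning_focused, assessment_focused, equal_var=False)
print(f"learning-focused mean: {learning_focused.mean():.1f} min")
print(f"assessment-focused mean: {assessment_focused.mean():.1f} min")
print(f"Welch t = {t:.2f}, p = {p:.3g}")
```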
Results:
Twenty learners participated in the study. There was no significant difference in the number of attempts for each practice type. The average time per learning-focused attempt was almost three times longer than per assessment-focused attempt: mean (SD) 16±1 min vs. 6±3 min respectively, p<0.05. Learners perceived the documentation of their scores as high-stakes and only recorded their better scores. Their perceptions were influenced by individual characteristics, contextual factors, and score representation.
Discussion:
In the context of outcome-based VR simulation training, learners used the assessments to mark their progression; however, automatic feedback was not more informative than supervisor feedback, and learners felt safer experimenting only if their assessments were not recorded.
Conclusion:
Factors such as the culture of medicine affect views on simulation-based as well as clinical-based assessments. The findings have importance for educators designing outcome-based simulation programs. With CBME changing the culture of assessment, we need to examine whether learners' attitudes will change.
References (maximum three)
1. Van Melle E, Frank JR, Holmboe ES, Dagnone D, Stockley D, Sherbino J; International Competency-based Medical Education Collaborators. A Core Components Framework for Evaluating Implementation of Competency-Based Medical Education Programs. Acad Med. 2019 Jul;94(7):1002-1009.
2. Watling CJ, Ginsburg S. Assessment, feedback and the alchemy of learning. Med Educ. 2019 Jan;53(1):76-85.
3. LaDonna KA, Hatala R, Lingard L, Voyer S, Watling C. Staging a performance: Learners' perceptions about direct observation during residency. Med Educ. 2017;51:498-510.
2:45 pm
Stephanie Moore-Lotridge
Andrew Rees, Samuel Johnson, Leigh Campbell, Jacob Schultz, Jewsin Raj, Ole Molvig, Bryan Tompkins, Nicholas Fletcher, Christopher Schoenecker and Jonathan Schoenecker
Background:
Didactic lectures continue to be an essential complement to hands-on clinical experiences in a post-graduate medical education curriculum. Cognitive psychology experiments suggest that introduction of information, followed by repeated “retrieval” through active recall, is the best way to generate long-term retention. The objective of this study was to build an app-based platform that could be implemented in conjunction with resident didactic lectures to 1) promote active recall and 2) provide individualized assessment of the material in real time.
Summary of Work:
An interactive, retrieval-based smart phone learning application was developed to supplement didactic lectures being presented to orthopaedic residents. The application employed gamified quiz and survey-based functions that prompted participation from learners. Both the presenter and residents received instantaneous feedback about their performance or survey responses through the application.
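As a purely hypothetical sketch of the real-time feedback idea (the actual application is not described in code in this abstract; class and field names below are invented), per-item responses could be tallied so that both the presenter and the residents see a summary instantly.

```python
# Hypothetical sketch of the real-time feedback idea behind such an app:
# tally each learner's answer per quiz item so the presenter can see, at a
# glance, where the audience is struggling. Names are invented for
# illustration; this is not the application described above.
from collections import Counter, defaultdict
from dataclasses import dataclass

@dataclass
class Response:
    resident_id: str
    question_id: str
    answer: str
    correct: bool

class LiveQuizBoard:
    def __init__(self):
        self._by_question = defaultdict(list)

    def record(self, response: Response) -> None:
        self._by_question[response.question_id].append(response)

    def question_summary(self, question_id: str) -> dict:
        responses = self._by_question[question_id]
        answers = Counter(r.answer for r in responses)
        pct_correct = (sum(r.correct for r in responses) / len(responses)
                       if responses else 0.0)
        return {"answers": dict(answers), "pct_correct": pct_correct}

# Example: the presenter reviews item "Q3" after residents respond.
board = LiveQuizBoard()
board.record(Response("r01", "Q3", "B", True))
board.record(Response("r02", "Q3", "C", False))
board.record(Response("r03", "Q3", "B", True))
print(board.question_summary("Q3"))  # {'answers': {'B': 2, 'C': 1}, 'pct_correct': 0.66...}
```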
Results:
Participating residents (N=18) reported that standard didactic lectures had variable effectiveness, while lectures given in conjunction with the interactive application were rated as much more effective than standard lectures (88.9% much more effective; 11.1% somewhat more effective). Importantly, all residents found the application easy to use (83.3% strongly agreed; 16.7% somewhat agreed) and noted that they would like to incorporate such a tool when they give future lectures (55.6% very likely to use; 44.4% somewhat likely to use). Presenters (N=3) commented that they found the application “helpful for guiding discussion” and for “identification of areas of deficiency in the audience”, aligning with its real-time assessment capacity.
Conclusion & Take home message:
A gamified, interactive learning application has the potential to improve individual learning experiences and participation during resident didactic lectures. Incorporating technology into residency lectures was both feasible and impactful for learners’ experiences. Based on these positive results, our team aims to expand the use of this smartphone application to additional residency lectures, with the aim of providing longitudinal assessment.
References (maximum three)
Roediger HL, Butler AC. The critical role of retrieval practice in long-term retention. Trends in cognitive sciences. 2011 Jan 1;15(1):20-7.
Zakrajsek T, Newton W. Promoting active learning in residency didactic sessions. Family Medicine. 2021;53(7):608-10.