Presentation Description
Briseida Mema 1
Dominique Piquette 1
1 Critical Care Medicine Department, Hospital for Sick Children
Background:
Disentangling formative and summative assessments in CBME remains an important issue. Simulation may help as a low-stakes learning environment, affording practice and experimentation, but it is not clear how trainees perceive assessment in simulation.
Summary:
The goal of this study was to explore trainees' perceptions of their virtual reality (VR) bronchoscopy simulation assessments in an outcome-based model of training. Specifically, we aimed to examine which assessments learners select to document and receive feedback on, and what influences their decision.
We used a sequential explanatory mixed-methods strategy. During independent simulation practice, we collected the number of attempts that were learning-focused practice (scores not recorded) and assessment-focused practice (scores recorded and reviewed by the instructor for the purpose of feedback), as well as the duration of each attempt. At the end of simulation training, we conducted interviews to explore learners' perceptions of assessment.
Results:
Twenty learners participated in the study. There was no significant difference in the number of attempts between the two practice types. The average time per learning-focused attempt was almost three times longer than per assessment-focused attempt (mean ± SD: 16 ± 1 min vs. 6 ± 3 min, p < 0.05). Learners perceived the documentation of their scores as high-stakes and recorded only their better scores. Their perceptions were influenced by individual characteristics, contextual factors, and score representation.
Discussion:
In the context of outcome-based VR simulation training, learners used the assessments to mark their progression; however, automatic feedback was not more informative than supervisor feedback, and learners felt safer experimenting only when their assessments were not recorded.
Conclusion:
Factors such as the culture of medicine affect views on simulation-based as well as clinical-based assessments. These findings are important for educators designing outcome-based simulation programs. As CBME changes the culture of assessment, we need to examine whether learners' attitudes will change.