Times are shown in GMT.
Assessment of clinical skills
Oral Presentation
10:00 am
28 February 2024
M211
Session Program
10:00 am
John Gimpel1
Richard Labaere2,1, Jeanne Sandella1 and John Boulet1
1 NBOME
2 ATSU
After the pandemic-associated discontinuation of the Level 2-PE clinical skills component of the COMLEX-USA national licensure examination in 2020-2021, a Special Commission recommended research on a school-based national standardized assessment of core osteopathic clinical skills, including communication skills, medical interviewing, and physical examination. The NBOME committed resources to developing a core competency capstone for osteopathic medical students (C3DO).
The C3DO pilot evaluated the feasibility of administering a centrally developed multi-station OSCE at osteopathic medical colleges. Senior medical students from four schools completed the eight-station OSCE prototype, which incorporated nationally standardized case content and extensive standardized patient and physician examiner training. The analysis cohort included 811 senior medical students. Descriptive statistics were used to compare performances both within and across schools. The reliability of the component and total scores was estimated using generalizability theory. Evidence to support the construct validity of the scores was gathered by quantifying the relationships between component scores (Pearson r) and by summarizing student survey responses regarding the adequacy and comprehensiveness of the assessment.
Preliminary results revealed some variability in performance across medical schools, as well as variability attributable to the choice of evaluator. The reliability of the patient-physician communication summary score was ρ² = 0.80. The ability to gather data (history taking, physical examination) was moderately correlated with communication ability (r = 0.38). Student survey responses indicated that most students felt the assessment was fair and measured the relevant competencies.
The results from the initial pilot studies suggest that administering a school-based clinical skills assessment is feasible. While some evidence was gathered to support the psychometric adequacy of the scores, further studies are needed to buttress the validity argument. The NBOME, as part of its strategy to build a standardized, school-based clinical skills assessment, will continue to collaborate with colleges of osteopathic medicine to adapt and refine the C3DO prototype.
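For illustration, the minimal Python sketch below shows how a one-facet (persons x stations) generalizability coefficient and a Pearson correlation between component scores of the kind reported above can be computed. It is not the analysis code used in the pilot: the score scale, variance magnitudes, and component groupings are hypothetical, and the variance-component formulas assume a fully crossed design with one observation per cell. Only the cohort size (811 examinees) and station count (eight) are taken from the abstract.

    # Illustrative sketch only (not the NBOME analysis code): a one-facet
    # generalizability study for a persons-by-stations OSCE score matrix,
    # plus a Pearson correlation between two hypothetical component scores.
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical scores: 811 examinees x 8 stations on a 0-100 scale.
    n_p, n_s = 811, 8
    true_ability = rng.normal(70, 8, size=(n_p, 1))
    scores = true_ability + rng.normal(0, 6, size=(n_p, n_s))

    grand = scores.mean()
    person_means = scores.mean(axis=1)
    station_means = scores.mean(axis=0)

    # Mean squares for the crossed p x s design (one observation per cell).
    ms_p = n_s * np.sum((person_means - grand) ** 2) / (n_p - 1)
    ms_s = n_p * np.sum((station_means - grand) ** 2) / (n_s - 1)
    resid = scores - person_means[:, None] - station_means[None, :] + grand
    ms_ps = np.sum(resid ** 2) / ((n_p - 1) * (n_s - 1))

    # Estimated variance components.
    var_ps = ms_ps                   # person x station interaction + error
    var_p = (ms_p - ms_ps) / n_s     # persons (universe-score variance)

    # Generalizability coefficient for an eight-station relative decision.
    g_coef = var_p / (var_p + var_ps / n_s)

    # Pearson r between two hypothetical component scores
    # (e.g., data gathering vs. communication).
    data_gathering = scores[:, :4].mean(axis=1)
    communication = scores[:, 4:].mean(axis=1)
    r = np.corrcoef(data_gathering, communication)[0, 1]

    print(f"G coefficient = {g_coef:.2f}, Pearson r = {r:.2f}")

In this design the coefficient equals the person variance divided by the person variance plus the interaction/error variance averaged over stations, so reliability increases as stations are added; a multi-station prototype of this kind can therefore plausibly reach values near the 0.80 reported above.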
References (maximum three)
National Board of Osteopathic Medical Examiners. COMLEX-USA Level 2-PE. Retrieved November 30, 2022, from https://www.nbome.org/assessments/comlex-usa/comlex-usa-level-2-pe/
National Board of Osteopathic Medical Examiners. The Special Commission on Osteopathic Medical Licensure Assessment Final Report. Retrieved January 12, 2023, from https://www.nbome.org/special-commission-on-osteopathic-medical-licensure-assessment/final-report/
10:15 am
Meghan McConnell
Samantha Halman1 and Debra Pugh
1 University of Ottawa
Background:
Point-of-care ultrasound (PoCUS) has enhanced physicians’ ability to rapidly diagnose and treat patients. As the role of PoCUS in the assessment and management of patients continues to evolve, it is imperative that medical educators develop ways to ensure that learners are safely acquiring these skills. The objective of the present study was to determine how the use of performance-based testing influenced the learning of the complex motor and cognitive skills associated with ultrasound image generation and interpretation, when compared with deliberate practice.
Summary of Work:
In Phase I, participants attended a didactic learning session on PoCUS and received hands-on training in six different PoCUS skills. Phase II took place one month later, when participants practiced the skills they had learned during Phase I. For each skill, half of the participants received retrieval practice (testing condition), whereas the other half received guided instruction (studying/control condition). Phase III took place three months later. In this final phase, participants completed a PoCUS OSCE consisting of six stations, each aligned with a previously studied PoCUS skill. Sonographers were present during the OSCE and evaluated each resident’s performance in real time.
Summary of Results:
Overall, 19 first-year internal medicine residents participated in the study. There was no significant difference in PoCUS performance between the testing (mean z-score = -0.01) and control (mean z-score = 0.01) conditions (t = 0.02, p = .983).
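As an illustration of the reported comparison, the short Python sketch below runs a paired t-test on hypothetical per-resident mean z-scores for the two conditions. The abstract does not state whether the comparison was paired or independent; the sketch assumes a within-subject (paired) design, on the reasoning that each resident contributed skills to both conditions, and all values are simulated rather than taken from the study data.

    # Illustrative sketch only: comparing hypothetical per-resident mean
    # z-scores between the testing and control conditions with a paired
    # t-test (assumes each resident contributed skills to both conditions).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)

    n_residents = 19
    # Hypothetical OSCE performance, standardized within station (z-scores).
    testing = rng.normal(0.0, 1.0, n_residents)
    control = rng.normal(0.0, 1.0, n_residents)

    t_stat, p_value = stats.ttest_rel(testing, control)
    print(f"testing M = {testing.mean():.2f}, control M = {control.mean():.2f}")
    print(f"t({n_residents - 1}) = {t_stat:.2f}, p = {p_value:.3f}")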
Discussion:
Test-enhanced learning (TEL) may not apply to the acquisition of procedural skills in medical education. We consider the limitations of the current study that make firm conclusions difficult at this time, and discuss future directions for studying how TEL influences procedural skill acquisition.
Take-home Message:
It remains unclear whether TEL improves the acquisition of complex motor and cognitive skills, such as those required for PoCUS.
References (maximum three)
Roediger H, Karpicke J. Test-enhanced learning: Taking memory tests improves long-term retention. Psychol Sci. 2006;17:249–55.
Slomer A, Chenkin J. Does test‐enhanced learning improve success rates of ultrasound‐guided peripheral intravenous insertion? A randomized controlled trial. AEM Educ Train. 2017;1(4):310–5.
Baghdady M, Carnahan H, Lam EWN, Woods NN. Test-enhanced learning and its effect on comprehension and diagnostic accuracy. Med Educ. 2014;48:181–8.
10:30 am
John Egbuji1
Megan Anakin2
1 University of Otago Medical School
2 University of Otago
Our medical school made urgent changes to how we assessed clinical skills because we were unable to conduct directly observed performances during the pandemic. We sought to mitigate the impact of this sudden change on student learning and on progression decision-making. We provided students with additional learning and feedback opportunities, which were documented later in their medical programme. This situation, however, gave rise to a question about the interchangeability of assessment formats and learning opportunities in the pre-clinical years of the medical degree.
An interpretive perspective framed this reflective exercise. We explored the idea of interchangeability by consulting the medical education assessment literature and colleagues with expertise in psychometrics and assessment. We analysed the functions of assessment described in the literature and recommendations gathered from colleagues to identify possible wanted and unwanted consequences on resources, outcomes, and student engagement.
We constructed a rationale to support the idea that assessment formats and learning opportunities can be interchanged when resources are constrained, when outcomes are long-term, and when account must be taken of the variable ways students engage with assessment and learning. This rationale is compatible with the principles of programmatic assessment and is tailored to our local context.
This type of reflective exercise may be useful to others faced with dilemmas, unexpected events, and renewal challenges to their assessment programmes. The rationale we constructed may be a productive starting point for assessment coordinators and leaders to have conversations with colleagues and students about changes to assessment practices at their institutions.
Discussions about interchangeability can be a productive way to address challenges to assessment practices. Assessment events can be visualised as learning opportunities and distributed across the duration of a medical programme.
A next step will be to examine the impact of this rationale on changes to assessment resource use, outcomes, and student engagement at our medical school.