Ottawa 2024

A case example of learning analytics data for program evaluation: Identifying gaps in teaching about gender differences in chest pain

Oral Presentation

Presentation Description

Zoe Brody, Shelley Ross¹
¹ University of Alberta



Background:
Learning analytics, including performance and behavioural metrics, are used to understand and improve learning and learning environments. In our program, we use FieldNotes (documentation of feedback in workplace-based teaching) as part of programmatic assessment of learners. FieldNotes can be de-identified and used for learning analytics. Recent concerns in the literature about health inequities for female patients with cardiovascular issues prompted us to consider how FieldNotes learning analytics data could be used to explore specific program evaluation questions. In this study, we used these learning analytics data to examine clinical teaching about gender differences in chest pain presentation.


Summary of work:
We conducted a secondary analysis of 12 years (July 2011 - June 2023) of archived FieldNotes learning analytics data. FieldNotes include narrative feedback to learners and descriptions of patient presentations, thus serving as a proxy for clinical teaching. FieldNotes about chest pain (search terms: "chest pain", "heart", "MI") were searched for the term 'atypical', and the sex of the patient was then determined. A chi-square goodness-of-fit test was used to test the assumption that the proportion of 'atypical' classifications was equal across sexes.
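As an illustration only, the search-and-test pipeline described above could be sketched roughly as below. The file name, column names, and matching rules are assumptions for demonstration, not the program's actual FieldNotes database or analysis code, and the real study would also require manual review of matches.

```python
# Illustrative sketch only: file name, column names, and matching rules are
# hypothetical; the actual FieldNotes database and analysis code are not shown here.
import pandas as pd
from scipy.stats import chisquare

# De-identified FieldNotes export (assumed columns: note_text, patient_sex).
notes = pd.read_csv("fieldnotes_deidentified.csv")

# Step 1: FieldNotes about chest pain (search terms from the abstract;
# word boundaries around "MI" reduce false matches such as "might").
chest_pattern = r"chest pain|heart|\bMI\b"
chest = notes[notes["note_text"].str.contains(chest_pattern, case=False, regex=True, na=False)]

# Step 2: FieldNotes that describe the presentation as 'atypical'.
atypical = chest[chest["note_text"].str.contains("atypical", case=False, na=False)]

# Step 3: chi-square goodness-of-fit test of the assumption that 'atypical'
# classifications are distributed equally across the recorded patient-sex categories.
observed = atypical["patient_sex"].value_counts()
result = chisquare(f_obs=observed)  # expected frequencies default to equal
print(f"chi-square = {result.statistic:.2f}, df = {len(observed) - 1}, p = {result.pvalue:.5f}")
```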


Results:
The database search (N = 64,942) identified 677 (1.04%) FieldNotes about chest pain. Of these, 76 (11.2%) described the symptoms as 'atypical'. Female patients' chest pain symptoms were described as 'atypical' significantly more frequently than expected under the equal-proportion assumption (χ² = 18.24; df = 2; p = 0.00011).
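For readers less familiar with the test, the reported statistic takes the standard goodness-of-fit form below; the per-category counts are not reported in the abstract, so this is shown only to clarify what the χ² and df values represent.

$$\chi^2 = \sum_{i=1}^{k} \frac{(O_i - E_i)^2}{E_i}, \qquad df = k - 1,$$

where $O_i$ and $E_i$ are the observed and expected numbers of 'atypical' FieldNotes for patient-sex category $i$ under the equal-proportion assumption, and $k$ is the number of categories compared.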

Discussion:
Our findings identified a teaching gap in our program around women's health. Having objective evidence of this gap can allow for targeted faculty development.

Conclusions:
Our study demonstrates the value of learning analytics for program evaluation and examination of quality-of-care issues. These data can also contribute to practice quality improvement.


Take-home messages/implications for further research:
Learning analytics data used for program evaluation can identify curriculum and teaching gaps. 



References:

1. Lee JR, Ross S. A comparison of resident-completed and preceptor-completed formative workplace-based assessments in a competency-based medical education program. Fam Med. 2022;54(8):599-605.

2. Ross S, Lawrence K, Bethune C, van der Goes T, Pélissier-Simard L, Donoff M, Crichton T, Laughlin T, Dhillon K, Potter M, Schultz K. Development, implementation, and meta-evaluation of a national approach to programmatic assessment in family medicine residency training. Acad Med. 2023;98(2):188-198.

3. Ross S, Poth C, Donoff M, Humphries P, Steiner I, Schipper S, Janke F, Nichols D. The Competency-Based Achievement System (CBAS): Using formative feedback to teach and assess competencies with Family Medicine residents. Can Fam Physician. 2011;57:e323-e330.
