Programmatic assessment and integrative approaches
Oral Presentation
10:00 am
28 February 2024
M212
Themes
Theme 8: Evaluation
Session Program
10:00 am
Holly Caretta-Weyer1
1 Stanford University School of Medicine
Background:
Competency-based medical education (CBME) has emerged as the future of health professions education across the continuum. However, significant barriers exist, particularly around programmatic assessment, including the challenges of implementing data-driven summative entrustment decision-making processes. Clinical competency committees (CCCs) are designated to review trainee assessment data and formulate recommendations for progression. However, the data inputs, group decision-making processes, and outputs of CCCs remain highly variable.
Summary of Work:
To better understand how CCCs utilize data and formulate summative entrustment decisions, we performed a contribution analysis of current CCC processes, including assessment data usage and group decision-making, and analyzed the ideal future state in which CCCs can defensibly derive summative entrustment decisions that inform graduation and initial certification decisions.
Results:
Our contribution analysis produced an impact pathway that identified assumptions and risks associated with CCC outputs, proximal outcomes, and distal outcomes in the assessment data usage and summative entrustment decision-making processes. By observing CCCs, we have learned how different groups grapple with assessment data and make decisions about resident progression. We have additionally explored which assessment gaps must be filled and which data visualizations are required to better support CCCs in making summative entrustment decisions, mitigating some of the risks and assumptions identified.
Discussion/Conclusions:
The contribution analysis makes clear that aligning CCC decision-making with fit-for-purpose assessment data, desired outcomes, learner preparedness, and patient and societal needs is a key next step for mitigating risks in the impact pathway. Only then can the principles of CBME truly be achieved.
Implications:
We intend to elaborate upon the assessment gaps identified in order to further develop programmatic assessment, mitigate the risks identified in the contribution analysis, and continually reevaluate CCC group decision-making using a realist lens to determine how to optimize the process and ensure robust summative entrustment decision-making within competency-based systems.
References (maximum three)
1. Van Melle, E., Gruppen, L., Holmboe, E. S., Flynn, L., Oandasan, I., Frank, J. R., et al. (2017). Using contribution analysis to evaluate competency-based medical education programs: It’s all about rigor in thinking. Academic Medicine, 92(6), 752-758.
10:15 am
Janica Jamieson1,2
Claire Palermo2, Margaret Hay2, Rachel Bacon3, Janna Lutze4 and Simone Gibson2
1 Edith Cowan University
2 Monash University
3 University of Canberra
4 Discipline of Nutrition & Dietetics, University of Wollongong
Background:
Programmatic assessment is an increasingly popular yet complex education initiative whose implementation is challenged by contextual parameters (Torre et al., 2021), necessitating robust evaluation to support the transfer of theory to practice (Haji et al., 2013). Contribution analysis (CA), a theory-based evaluation framework, determines the contribution an intervention makes to outcomes. CA enables evaluation of complex interventions in dynamic, authentic settings (Mayne, 2012), making it well suited to programmatic assessment. We applied the six steps of CA to evaluate programmatic assessment.
Summary of Work:
(1) Cause-effect questions and (2) a theory of change (ToC) were developed. (3) A qualitative study with programmatic assessment stakeholders (faculty n=19, graduates n=15, supervisors n=32) from four Australian dietetic programs provided evaluation data. These data were (4) assembled into contribution claims and a contribution story. (5) Additional data were gathered from the same stakeholders and the literature to (6) finalise the ToC and contribution story.
Results:
Leaders initiated and drove the development of programmatic assessment through a design team who applied the principles as guides and used compromise to navigate challenges, leading to a contextually responsive programmatic assessment. All users needed training, with fit-for-purpose tools implemented within an ideologically aligned assessment system. Students became leaders, supervisors became teachers, and faculty became facilitators, working collaboratively as a learning team with a growth mindset. An assessment team used the congruency of collated low-stakes data to inform high-stakes decisions. Causal pathways coalesced to create collaborative, individualised learning environments and psychologically safe remediation, enabled credible high-stakes decisions, and prepared graduates for practice. Ultimately, people experienced less stress and care recipients benefited.
Discussion:
Successful programmatic assessment requires leaders to bring together capable people to enact role responsibilities as intended.
Conclusions:
CA revealed important causal links underpinning, and leading to, programmatic assessment outcomes.
Implications:
Leverage and risk points are illuminated for implementors to facilitate individualised and successful manifestations of programmatic assessment across diverse settings.
References (maximum three)
Haji, F., Morin, M-P., & Parker, K. (2013). Rethinking programme evaluation in health professions education: beyond ‘did it work?’ Medical Education, 47(4), 342-351.
Mayne, J. (2012). Contribution analysis: coming of age? Evaluation, 18(3), 270-280.
Torre, D., Rice, N. E., Ryan, A., Bok, H., Dawson, L. J., Bierer, B., Wilkinson, T. J., Tait, G. R., Laughlin, T., Veerapen, K., Heeneman, S., Freeman, A., & van der Vleuten, C. (2021). Ottawa 2020 consensus statements for programmatic assessment 2: Implementation and practice. Medical Teacher, 43(10), 1149-1160.
10:30 am
Jinlong Gao1
Delyse Leadbeatter1
1 University of Sydney
Background:
The delivery of person-centred healthcare is a key graduate requirement for health professional employability. However, ways to teach students to practise person-centred healthcare are less well established. Liberal arts pedagogy is increasingly recognised as an approach in contemporary health professional curricula that enables students to embody person-centred healthcare.
Summary of work:
We drew upon pedagogical features of the liberal arts to design programmatic assessment in a contemporary dentistry curriculum. We used Lewis’ (2018) work on the liberal arts and medical education, together with our own work on student engagement and employability in health professional education, to inform and evaluate the design.
Results:
Pedagogical features drawn from the liberal arts, such as peer effects, artworks, artefacts, and narratives, will be described. An example of using faculty narrative reflections on a nominated art piece to showcase course content and facilitate connection between students, faculty, and the course will be introduced and analysed.
Discussion:
The value of using liberal arts pedagogy to teach person-centred healthcare is not uniformly apparent. For faculty and novice students, it can be a challenge to appreciate the formation of a person-centred approach, and even more difficult to participate in the assessment activities. Students can be influenced by faculty involvement in the development of assessment exemplars.
Conclusions:
Programmatic assessment can serve as a scaffold to intentionally use pedagogical features of the liberal arts to assist health professional students to build a repertoire of person-centred skills.
Take-home messages:
Linking student engagement and employability with liberal arts pedagogical features gives educators agency to teach the skills required for person-centred care
A programmatic assessment design is a versatile program structure for integrating liberal arts pedagogy into a health professional curriculum
Faculty are not just judges but participants in the assessment culture
References (maximum three)
Lewis, P. (2018). Globalizing the liberal arts: Twenty-first-century education. In Higher education in the era of the fourth industrial revolution (pp. 15-38).
Leadbeatter, D., & Gao, J. (2018). Engaging oral health students in learning basic science through assessment that weaves in personal experience. Journal of Dental Education, 82(4), 388-398.
Leadbeatter, D., Nanayakkara, S., Zhou, X., & Gao, J. (2023). Employability in health professional education: A scoping review. BMC Medical Education, 23(1), 1-11.
10:45 am
Lizemari Hugo-van Dyk1
Danelle Haumann1, Annali Fichardt1 and Champion N. Nyoni1
1 University of the Free State
Background:
Qualifications authorities and other health professions regulatory bodies often recommend integrative assessment as a preferred assessment approach. Integrative assessment aims to combine students’ learning from multiple modules into a single assessment. However, the concept of integrative assessment is unclear, leading to challenges in its interpretation and implementation in higher education institutions. Understanding the concept can clarify how to effectively define and implement integrative assessment.
Summary of Work:
A concept analysis of integrative assessment in the context of health professions education was conducted using Walker and Avant’s (2014) approach. A literature search across 15 databases yielded 921 abstracts, of which 19 articles were relevant.
Results:
Defining attributes of integrative assessment were extracted and analysed, resulting in ten themes: assessment tasks; competence; learning; context; curriculum; instruments, tools, measures and scoring; multiple competencies; interdisciplinary involvement; feedback; and systems. Each theme had subthemes, adding depth and detail to the concept. The antecedents and consequences of integrative assessment were also analysed, resulting in eight and six themes respectively.
A conceptual definition of integrative assessment was produced: "Integrative assessment is a holistic approach, informed by an integrated curriculum, that includes diverse integrated assessment tasks to evaluate student learning and performance by assessing competencies across disciplines in an authentic context, using various scoring instruments to provide feedback on students’ competence within a robust assessment system". Additionally, the study found no empirical referents of integrative assessment, which affects the quality of its implementation.
Conclusions:
A conceptual definition of integrative assessment was generated, which can guide higher education institutions in implementing integrative assessment in alignment with policy and guidelines. Effective implementation of integrative assessment can lead to higher competence levels among practising health professionals. Research is needed on the implementation of integrative assessment within health professions education, and there is a dire need to develop an instrument to measure its implementation.
References (maximum three)
Walker, L. O., & Avant, K. C. (2014). Strategies for theory construction in nursing (5th ed.). Boston: Prentice Hall.