Ottawa 2024

Programmatic assessment approaches

Oral Presentation

11:00 am

27 February 2024

M210

Session Program

Nancy Moreno1
Peter Boedeker1 and Nadia Ismail1
1 Baylor College of Medicine 


Background
Many medical and health professions education programs are embracing innovations, such as active learning, integrated curricula, early clinical experiences, and opportunities for student research. The effectiveness of such innovations, however, hinges on the quality of their implementation by key players, such as content developers, facilitators, and coordinators. Examining the concerns of the implementers and their levels of implementation is essential for informed decision-making and program improvement. The evidence-driven Concerns-Based Adoption Model (CBAM) is a useful conceptual framework for this purpose because it holistically considers and helps evaluate the technical, organizational, and personal aspects of the change process. (1) 


Importance of Topic
The successful implementation of educational innovations requires a thorough examination and resolution of educator concerns and needs, a major challenge highlighted in a recent study of curricular reforms in medical education across North America. (2) The CBAM framework is a valuable tool in this process because it assesses the degree to which individuals integrate innovations into their practice, thereby guiding the design of appropriate intervention and support strategies. Moreover, it specifically addresses Kirkpatrick Level 3 (behavior), a critical aspect often overlooked in program evaluation and assessment frameworks. (3) 

In this session, an experienced curriculum and evaluation/assessment team will guide participants through each of the three elements of the model and provide opportunities to plan for applying the tools at their own locations. 


Workshop Format
The workshop will apply active learning approaches throughout the session. 

  • Session Introduction: Think-pair-share activity to discover participants’ current program assessment and evaluation challenges or opportunities. 
  • Understanding the Concerns-Based Adoption Model (CBAM): Overview of the seven categories of individual concerns regarding an innovation, followed by a small group activity identifying faculty concerns related to education reforms at their own institutions. 
  • Applying the Model: Case study of how the CBAM model is being applied at one institution. 
  • Small Group Activity: Each group collaboratively develops an example “innovation configuration,” with specific descriptions of what a new program should look like in practice, including expected levels of use. 
  • Presentations and Discussion: Groups will describe their innovation configurations; followed by discussion of how the approach can be used for longitudinal program evaluation and to gauge fidelity of implementation. 
  • Concluding Discussion: Group discussion of key takeaways, such as: How might this evaluation approach be used in your own situation? How does estimating individual stages of concern affect faculty development priorities? How do the data provided through this approach complement other ongoing assessments at your institution? (A minimal code sketch of stage estimation follows this list.) 
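As a concrete illustration, the minimal Python sketch below estimates a respondent's peak stage of concern from hypothetical questionnaire subscale scores. The seven stage names follow the published CBAM framework; the max-score rule is a simplified assumption, not the official Stages of Concern Questionnaire scoring procedure.

    # Estimate a respondent's peak CBAM stage of concern from raw subscale
    # scores. Stage names follow the published framework; the max-score
    # rule is an illustrative simplification.
    CBAM_STAGES = {
        0: "Unconcerned",
        1: "Informational",
        2: "Personal",
        3: "Management",
        4: "Consequence",
        5: "Collaboration",
        6: "Refocusing",
    }

    def peak_stage(subscale_scores):
        """Return (stage number, stage name) with the highest raw score."""
        stage = max(subscale_scores, key=subscale_scores.get)
        return stage, CBAM_STAGES[stage]

    # Example: a facilitator whose strongest concerns are logistical (Stage 3).
    scores = {0: 2.1, 1: 3.4, 2: 3.8, 3: 4.6, 4: 2.9, 5: 2.2, 6: 1.7}
    print(peak_stage(scores))  # -> (3, 'Management')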


Who Should Participate
This workshop is relevant to a wide range of conference participants, including curriculum developers, course leaders, and individuals engaged in program assessments or evaluations. They will gain experience with a comprehensive, stakeholder-centered, research-based approach to evaluating the implementation of educational programs.


Level of Workshop
Beginner and intermediate


Workshop Outcomes
After participating in this workshop, participants will be able to: 
  • describe the three elements of the Concerns-Based Adoption Model framework, and 
  • apply the framework for formative and summative program evaluations of educational innovations (curriculum, teaching practices, etc.) at their own institutions. 



References 

  1. Hord S, Rutherford W, Huling L, Hall G. (2014). Taking Charge of Change. Austin, TX, USA: SEDL. 

  2. Pock A, Durning S, et al. (2019). Post-Carnegie II curricular reform: A North American survey of emerging trends & challenges. BMC Med Educ, 19, 260. 

  3. Nouraey P, Al-Badi A, Riasati M, Maata R. (2020). Educational Program and Curriculum Evaluation Models: A Mini Systematic Review of the Recent Trends. Universal Journal of Educational Research, 8(9), 4048–4055. 

Michelle Daniel1
Stuart Lane2, Chris Roberts, James Murphy3, Holly Caretta-Weyer4, Eric Holmboe5, Brian Kwan1, Dario Torre6 and Priya Khanna7
1 University of California, San Diego
2 Sydney Medical School
3 University of California San Diego School of Medicine
4 Stanford University School of Medicine
5 Accreditation Council for Graduate Medical Education
6 University of Central Florida
7 School of Medicine, The University of Sydney




Background 
Programmatic assessment (PA) is changing the way we think about assessment in a post-psychometric, competency-based education era. Unlike traditional "module-test" assessment models, where the focus is largely on assessment of learning, programmatic assessment offers a holistic, longitudinal approach that combines assessment of learning with assessment for learning. PA can be applied to any competency within larger systems of assessment. 


The Principles 
PA requires information about learners' competence to be collected and analyzed continuously. Multiple assessment data points are accumulated over time from a mix of assessment methods to create an increasingly detailed picture of learner competence. There is a constant reflective dialogue with the learner about individual assessment data points, goals, and competency trajectories, for the purpose of providing rich feedback. Decisions about competence and progress are made at a remove from any individual assessment, course, or clerkship, through the integration of data. Credible, equitable decisions are based on the principles of proportionality and triangulation. 
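To make these principles concrete, the minimal Python sketch below models assessment data points and two of the ideas described above: triangulation across methods and proportionality of decisions. All names and thresholds are illustrative assumptions, not part of any published PA specification.

    # Minimal sketch of proportionality and triangulation: a progression
    # decision should rest on many data points from multiple methods.
    # The thresholds below are arbitrary illustrations.
    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class DataPoint:
        competency: str   # e.g., "communication"
        method: str       # e.g., "OSCE", "mini-CEX"
        score: float      # normalized to 0-1
        when: date

    def ready_for_decision(points, min_points=8, min_methods=3):
        """Proportionality: high-stakes decisions need many triangulated points."""
        methods = {p.method for p in points}
        return len(points) >= min_points and len(methods) >= min_methods

    def trajectory(points):
        """Scores in chronological order, to support dialogue about growth."""
        return [p.score for p in sorted(points, key=lambda p: p.when)]

    pts = [DataPoint("communication", "OSCE", 0.62, date(2024, 1, 10)),
           DataPoint("communication", "mini-CEX", 0.70, date(2024, 2, 14))]
    print(ready_for_decision(pts))  # False: too few points and methods so far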


The Challenge 
While the theoretical principles of programmatic assessment are clear, implementation remains a significant challenge. In this workshop, we will discuss three keys to successful implementation of PA: 

1) Robust learner dashboards and data analytics (to collect and integrate multiple assessment data points, potentially leveraging newer technologies such as machine learning; see the sketch after this list) 

2) Competency committees (to review data separately from individual courses and clerkships, and make decisions on progress) 

3) Coaching and advising programs (to support reflective dialogue and ensure feedback) 
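The sketch below illustrates how keys 1 and 2 might connect: assessment records are rolled up per competency for a learner dashboard, and low-scoring competencies are flagged for competency committee review. The data shape, aggregation rule, and 0.6 threshold are assumptions for illustration only.

    # Minimal sketch of a dashboard rollup (key 1) that flags competencies
    # for committee review (key 2). Schema and threshold are illustrative.
    from collections import defaultdict
    from statistics import mean

    def dashboard_summary(points):
        """Per-competency rollup; each point is a dict with 'competency',
        'method', and 'score' (0-1) keys."""
        by_comp = defaultdict(list)
        for p in points:
            by_comp[p["competency"]].append(p)
        return {
            comp: {
                "n_points": len(ps),
                "mean_score": round(mean(p["score"] for p in ps), 2),
                "methods": sorted({p["method"] for p in ps}),
                "flag_for_committee": mean(p["score"] for p in ps) < 0.6,
            }
            for comp, ps in by_comp.items()
        }

    points = [
        {"competency": "communication", "method": "OSCE", "score": 0.72},
        {"competency": "communication", "method": "mini-CEX", "score": 0.55},
        {"competency": "communication", "method": "MSF", "score": 0.48},
    ]
    print(dashboard_summary(points))  # mean 0.58 -> flagged for review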


Why is the topic important for research and / or practice?
While many educators embrace the principles of programmatic assessment, implementation is often difficult. By focusing on keys to success, we hope to address some of the frequently encountered barriers to full implementation of the model. 


Workshop format, including participant engagement methods
Initial think-pair-share exercise: What problem(s) is PA trying to solve? Why do we need to embrace PA? (10 min). Didactic on the principles of PA and an overview of implementation challenges (20 min). Panel discussion on keys to successful PA implementation (30 min). 


Who should participate?
Faculty who design / implement curricula and programs of assessment 


Level of workshop
Novice / intermediate 


Take-home messages / workshop outcomes / implications for further research or practice
Understanding and addressing implementation challenges helps ensure that programmatic assessment is executed successfully and yields its intended benefits. This, in turn, can have a positive impact on teaching and learning outcomes, enabling educators to better support their learners' growth and development. 


Maximum number of participants
50 



References 

Heeneman S, de Jong LH, Dawson LJ, Wilkinson TJ, Ryan A, Tait GR, Rice N, Torre D, Freeman A, van der Vleuten CP. Ottawa 2020 consensus statement for programmatic assessment - 1. Agreement on the principles. Medical Teacher. 2021;43(10):1139-48. 

Torre D, Rice NE, Ryan A, Bok H, Dawson LJ, Bierer B, Wilkinson TJ, Tait GR, Laughlin T, Veerapen K, Heeneman S. Ottawa 2020 consensus statements for programmatic assessment - 2. Implementation and practice. Medical Teacher. 2021;43(10):1149-60.