Presentation Description
Zheng-Wei Lee1
Olivia Ng1, Li Li1, Jowe Chu1, Lucy Victoria Everett Wilding1, Jennifer Anne Cleland1 and Dong Haur Phua1
1 Lee Kong Chian School of Medicine, Nanyang Technological University
Major curriculum reform is typically accompanied by a review of assessment processes. Reports of assessment change implementation are scarce, and there is little guidance on using technology and data analytics productively to inform and communicate assessment data. This abstract therefore provides an overview of the preliminary stages of introducing programmatic assessment (PA) as part of an MBBS curricular reform. PA shifts the emphasis from solely formal examination-type assessments to authentic, competency-based assessment drawing on multiple sources, including clinical workplace assessments, and multiple time points (Norcini et al., 2018; van der Vleuten et al., 2012). We focused on developing an assessment dashboard that clearly aligns curriculum and assessment, signals progress, and promotes self-regulated learning for students, adopting a human-centric approach structured around the four iterative, solution-focused phases of the design thinking framework: discovery, ideation, experimentation, and evolution (Henriksen et al., 2017). In the discovery phase, we identified our primary challenge as consolidating disparate assessment data. Streamlining the assessment data infrastructure was a critical and highly complex first step, requiring engagement with various stakeholders, including faculty members, administrators, and students, to understand their needs and requirements for an assessment dashboard, as well as identifying technological enablers (ideation). At this stage we also constructively aligned our learning outcomes and assessment items with the Singapore National Framework. Next, we reviewed our data architecture by testing the gathering of assessment information from different formative assessment items and learning platforms (experimentation), as sketched below. Evolution, in the form of piloting, will follow later in 2024. We report on this user-centric development process to help others considering a shift to PA, and to offer transferable, practical insights for stakeholders who seek to design personalised feedback that is fit for context and uses technology and data analytics effectively.
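The abstract does not specify the data architecture, but a minimal sketch may help readers picture the consolidation step. Assuming, purely hypothetically, that each learning platform exports results in its own format (the platform names, field names, and schema below are illustrative assumptions, not the school's actual implementation), consolidation amounts to normalising every export into one shared record format that a dashboard can query:

    from dataclasses import dataclass
    from datetime import date

    # Hypothetical unified schema for a single assessment data point.
    @dataclass
    class AssessmentRecord:
        student_id: str
        outcome_code: str   # learning outcome mapped to the national framework
        source: str         # e.g. "workplace_assessment", "progress_test"
        score: float        # normalised to 0..1 so sources are comparable
        assessed_on: date

    def from_platform_a(row: dict) -> AssessmentRecord:
        """Map one row of a hypothetical platform A export onto the schema."""
        return AssessmentRecord(
            student_id=row["matric_no"],
            outcome_code=row["lo_code"],
            source="platform_a",
            score=row["raw_score"] / row["max_score"],
            assessed_on=date.fromisoformat(row["date"]),
        )

    def from_platform_b(row: dict) -> AssessmentRecord:
        """Map one row of a hypothetical platform B export onto the schema."""
        return AssessmentRecord(
            student_id=row["student"],
            outcome_code=row["outcome"],
            source="platform_b",
            score=row["percent"] / 100.0,
            assessed_on=date.fromisoformat(row["timestamp"][:10]),
        )

Under this kind of design, each new assessment source needs only its own mapping function, while the dashboard, progress signals, and feedback views depend solely on the shared record format.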
References
Henriksen, D., Richardson, C., & Mehta, R. (2017). Design thinking: A creative approach to educational problems of practice. Thinking Skills and Creativity, 26, 140-153. https://doi.org/10.1016/j.tsc.2017.10.001
Norcini, J., Anderson, M. B., Bollela, V., Burch, V., Costa, M. J., Duvivier, R., Hays, R., Palacios Mackay, M. F., Roberts, T., & Swanson, D. (2018). 2018 consensus framework for good assessment. Medical Teacher, 40(11), 1102-1109. https://doi.org/10.1080/0142159X.2018.1500016
van der Vleuten, C. P. M., Schuwirth, L. W. T., Driessen, E. W., Dijkstra, J., Tigelaar, D., Baartman, L. K. J., & van Tartwijk, J. (2012). A model for programmatic assessment fit for purpose. Medical Teacher, 34(3), 205-214. https://doi.org/10.3109/0142159X.2012.652239