Ottawa 2024

Workplace-based Assessments

Oral Presentation

Oral Presentation

11:30 am

28 February 2024

M210

Session Program

Luan Au1
My Do1 and Hien Nguyen1
1 University of Medicine and Pharmacy at Ho Chi Minh City (UMP) 



In competency-based medical education, learners acquire and develop professional competencies through practice. Workplace-based assessment (WPBA) and simulation-based assessment provide stakeholders with accurate evidence of learner competencies. In undergraduate training (UGT), mini-clinical evaluation exercises (mini-CEX) and direct observation of procedural skills (DOPS) accurately reflect learner performance. The portfolio, by contrast, is far from an independent WPBA tool. 

This paper summarises changes in the WPBA strategy for UGT during the UMP curriculum renovation and discusses potential solutions that might improve the validity of WPBA. 

In the 2010s, the Department of Ob-Gyn began its search for an accurate and suitable WPBA strategy. The first edition included four specific mini-CEXs and two specific DOPS. It enhanced learner performance but increased educator workload, provoking negative reactions from educators. The second edition had only one multipurpose, rubric-based mini-CEX. Its complexity negatively impacted learner orientation and educator acceptance. A digital mini-CEX characterised the third edition, which aimed to improve the user acceptance rate and database management. Again, the time required discouraged educators from running the digitalised mini-CEX. We removed DOPS from the last two WPBA editions due to licensing requirements. We still consider the portfolio a formative tool, and therefore asked learners to prepare their portfolios adequately to support self-awareness. 

From these failures, we consider it mandatory to reform our WPBA strategy. The mini-CEX remains the primary tool, with the portfolio as a secondary one. Concerning the mini-CEX: a series of specific mini-CEXs appears better than an 'all-in-one' version; the mini-CEX should use detailed rubrics, which strengthen the correlation between scores and performance; a flexible exam agenda gives learners helpful autonomy; and preparing educators to conduct the mini-CEX is mandatory. Concerning the portfolio, we prioritise crafting a comprehensive user guide and removing unnecessary fields from the current version. We also support digital design and management, which allow ubiquitous use and effective administration. 



References (maximum three) 

1. Al Ansari A. The construct and criterion validity of the mini-CEX: a meta-analysis of the published research. Acad Med. 2013;88:413-420. 

2. Lörwald AC, Lahner FM, Nouns ZM, Berendonk C, Norcini J, Greif R, Huwendiek S. The educational impact of mini-clinical evaluation exercise (mini-CEX) and direct observation of procedural skills (DOPS) and its association with implementation: a systematic review and meta-analysis. PLoS ONE. 2018 Jun 4. 

3. Norcini J, Burch V. Workplace-based assessment as an educational tool: AMEE Guide No. 31. Medical Teacher. 2007;29:855-871. 

Alyssa Anderson1
Imogene Rothnie2, Libby Newton1 and Susi McCarthy1
1 Royal Australasian College of Physicians
2 ANZAHPE, AES


 

Background 
Work-based assessments (WBAs) play a crucial role in clinical competence assessment, as they provide authentic opportunities for tailored feedback for learning. However, a common problem in WBA implementation is trainees’ tendency to approach them with a compliance mindset (Bindal et al. 2011). Drawing from WBA workflows in the Royal Australasian College of Physicians’ new Basic Training Program, this study explores factors contributing to trainees’ compliance mindset regarding WBAs and offers strategies to foster a growth mindset. 


Summary of work 
Analysis of WBA feedback records (n=73) was triangulated with results from a trainee survey (n=16, 21%) and interviews (n=4) to construct a representation of a typical WBA workflow. Divergences between actual and intended WBA workflows were examined to elucidate factors contributing to compliance mindsets. 


Results 
While most trainees completed WBAs at the expected standard, workflows showed evidence of trainees employing a compliance mindset rather than deliberately using WBAs to support their learning. Many factors contributed to this mindset, including difficulty identifying constructive WBA tasks, trainee-assessor power dynamics, and generic feedback from assessors focused on gaining experience or confidence, rather than on specific areas of improvement. Other factors such as rotation type, workloads, education technology and the influence of other assessments fostered the compliance mindset. 


Discussion/Conclusion 
An array of factors contributed to trainees’ compliance mindset around WBAs. Strategies to combat this mindset and support a growth mindset include: 1) supporting trainees to identify educationally valuable tasks for WBAs, 2) developing assessors' skills in delivering feedback, 3) ensuring educational technology permits recording of real-time feedback, and 4) supporting trainees to access protected time so they can meaningfully engage with WBAs. 


Take-home messages / Further research 
Future work could focus on developing learners’ assessment literacy in the context of WBAs and support for learners to self-regulate their learning across training contexts and competencies. 



References (maximum three) 

Bindal T, Wall D, Goodyear HM. Trainee doctors' views on workplace-based assessments: Are they just a tick box exercise? Med Teach. 2011;33(11):919-27. doi: 10.3109/0142159X.2011.558140. PMID: 22022902. 

Chak Man Jane Li1
Nidhi Garg2, Karen Scott1, Naomi Staples3 and Venessa Tsang2
1 University of Sydney
2 Sydney University
3 University of Sydney Medical School




Background 
Workplace-based assessments (WBAs) are clinical assessment tools commonly used in competency-based medical education for assessment ‘of’ and/or ‘for’ learning. Their effectiveness and utility are dependent on user-tool-context interactions, including user engagement, assessment literacy, tool attributes and learner-educator relationships (1). This qualitative study aims to explore final year medical students’ perceptions and experiences using WBAs as a learning and assessment tool, with the goal of improving effectiveness and utility. 


Summary 
A portfolio of WBAs, based on Junior Medical Officer core skills, was implemented for final year medical students at Sydney University metropolitan and rural clinical schools, in accordance with programmatic assessment philosophy and Australian Medical Council accreditation standards. Using grounded theory, we conducted three focus groups with students. Taking a theory-informing inductive approach to data analysis, we used line-by-line coding and constant comparison to identify initial patterns (2), and applied approaches-to-learning theory to interpret the data (3). 


Results 
Students’ learning approaches to WBAs depended on perceptions of usefulness and feasibility. When wanting to improve capability in a specific skill, they took a deep approach, seeking supervisors who gave good feedback. Most WBAs were undertaken through a strategic approach, focused on assessment outcomes. Students distinguished between: “real WBAs”, undertaken as directed and observed by a supervisor; “hypothetical WBAs”, discussed with supervisors when performing the task appeared unfeasible; and combined WBAs, which conflated previous performances and were retrospectively signed off by supervisors. Students at rural hospitals felt forced into taking a deep approach because supervisors were familiar with expected standards. 


Discussion 
Students make their own decisions about performing WBAs, based on their learning approach and perception of feasibility. 


Conclusions 
Improving supervisors’ understanding of WBAs and expected standards may help students undertake them as designed and encourage feedback. 


Implications 
Longitudinal research is needed to understand changes to students’ performance and learning as supervisors become more familiar with WBAs. 



References (maximum three) 

1. Prentice S, Benson J, Kirkpatrick E, Schuwirth L. Workplace-based assessments in postgraduate medical education: a hermeneutic review. Medical Education. 2020;54:981-992. 

2. Varpio L, Paradis E, Uijtdehaage S, Young M. The distinctions between theory, theoretical framework, and conceptual framework. Academic Medicine. 2020;95(7):989-994. 

3. Entwistle N, Hanley M, Hounsell D. Identifying distinctive approaches to studying. Higher Education. 1979;8(4):365-380. 

Damien Clark1
Karen Smart1 and Kelly Hennessy1
1 CQUniversity



1. Background 
Over the past six years, clinical academics from CQUniversity's Bachelor of Oral Health (BOH) program, working together with Learning Design and Innovation (LDI), have created a novel online system designed for student assessment and feedback in workplace-based learning environments. 


2. Summary of work 
In 2017, the team recognised a deficiency in existing systems that could support assessment strategies focused on students. The team integrated criterion-referenced rubrics and written feedback into a secure, online system known as ORAS (Online Real-time Assessment System). In addition to providing students with immediate feedback, ORAS enables remote, real-time monitoring of student performance and progress over multiple sites and with multiple clinical supervisors. 


3. Results 
Previously, assessment was conducted through paper-based workbooks, with reliance on clinical supervisors to report concerns about student performance. Using ORAS, academic staff are now able to quickly identify and track students with clinical progress issues, facilitating timely interventions. Time spent collating student grades and monitoring students at off-campus placements has been greatly reduced, and feedback provided to students by clinical supervisors can now be monitored in real time. Student and clinical supervisor feedback has been positive and has guided improvements to ORAS over the past six years. 


4. Discussion 
This presentation highlights a novel method that facilitates student-centred, real-time, effective, and ongoing progress monitoring throughout the term. 


5. Conclusions 
This presentation explores the most recent iteration of ORAS, the evolution of the software based on stakeholder feedback, and the future possibilities this new foundation enables, including integration with student self-reflective practice. 


6. Implications for further research or practice 
Further research is ongoing to evaluate the effectiveness of student progress monitoring. 


References (maximum three) 

N/A