Providing quality feedback to general internal medicine residents in a competency-based assessment environment
DOI: https://doi.org/10.36834/cmej.57323

Abstract
Construct: Competency-Based Medical Education (CBME) is designed to use workplace-based assessment (WBA) tools to provide observed assessment of, and feedback on, resident competence. Moreover, WBAs are expected to provide evidence beyond that of more traditional mid- or end-of-rotation assessments [e.g., In-Training Evaluation Records (ITERs)]. In this study we investigate competence in General Internal Medicine (GIM) by contrasting WBA and ITER assessment tools.
Background: WBAs are hypothesized to improve and differentiate written and numerical feedback to support the development and documentation of competence. In this study we investigate residents’ and faculty members’ perceptions of WBA validity, usability, and reliability and the extent to which WBAs differentiate residents’ performance when compared to ITERs.
Approach: We used a mixed methods approach over a three-year period, drawing on perspectives gathered from focus groups and interviews, along with numerical and narrative comparisons between WBAs and ITERs in one GIM program.
Results: Residents indicated that the narrative component of feedback was more constructive and effective than numerical scores, and they perceived the specific, workplace-based feedback of WBAs as more effective than ITERs. However, quantitative analysis showed that the overall rate of actionable feedback across both ITERs and WBAs was low (26%), with only 9% of assessments providing an improvement strategy. The provision of quality feedback was not statistically significantly different between tools; WBAs provided more actionable feedback, while ITERs provided more improvement strategies. More than half of all assessments came from 11 core faculty.
Conclusions: Participants in this study viewed narrative, actionable, and specific feedback as essential, and showed an overall preference for written feedback over numerical assessments. However, quantitative analyses showed that specific, actionable feedback was rarely documented, despite both groups qualitatively emphasizing its importance for developing competence. Neither formative WBAs nor summative ITERs clearly provided better feedback, and both may still have a role in overall resident evaluation. Participants' views on roles and responsibilities also diverged: residents stated that faculty should be responsible for initiating assessments, and faculty stated the reverse. These results reveal a disconnect between resident and faculty perceptions and practice around giving feedback, and highlight opportunities for programs adopting and implementing CBME to address how best to support residents and frontline clinical teachers.