Benchmarking a Canadian anesthesiology resident research program against national norms using a logic model framework: a quality improvement study.




Background: Canadian specialty training programs are expected to deliver curriculum content and assess competencies related to the CanMEDS Scholar role. We evaluated our residency research program and benchmarked it against national norms for quality improvement purposes.

Methods: In 2021, we reviewed departmental curriculum documents and surveyed current and recently graduated residents. We applied a logic model framework to assess whether our program's inputs, activities, and outputs addressed the relevant CanMEDS Scholar competencies. We then descriptively benchmarked our results against a 2021 environmental scan of Canadian anesthesiology resident research programs.

Results: Local program content was successfully mapped to competencies. The local survey response rate was 40/55 (73%). In benchmarking, our program excelled in providing milestone-related assessments; research funding; administrative, supervisory, and methodologic support; and in requiring a literature review, proposal presentation, and local abstract submission as outputs. Activities accepted as meeting research requirements varied greatly among programs. Balancing competing clinical and research responsibilities was a frequently reported challenge.

Conclusions: The logic model framework was easily applied and demonstrated that our program benchmarked well against national norms. National-level dialogue is needed to develop specific, consistent Scholar role activities and competency assessments to bridge the gap between expected outcome standards and educational practice.




Published 2023-02-15; updated 2023-03-21

How to Cite

Barbour-Tuck E, Mutter T, O’Brien JM, Girling L, Choo E, Gamble J. Benchmarking a Canadian anesthesiology resident research program against national norms using a logic model framework: a quality improvement study. Can. Med. Ed. J [Internet]. 2023 Mar. 21 [cited 2023 May 29];14(1):108-16. Available from: