User experience of the Written Exam Question Quality tool to inform the writing of new written-exam questions

Authors

É. Vachon Lachiver, C. St-Onge

DOI:

https://doi.org/10.36834/cmej.72320

Abstract

Background: Creating new written-exam questions is a burdensome task for faculty members. While several guidelines exist, no previous attempt had been made to streamline them into a user-friendly tool. We created the Written Exam Question Quality tool (WEQQ) and explored potential users’ perceptions of this tool when writing their exam questions.

Methods: We conducted a descriptive study to explore how four Canadian faculty members used the WEQQ. Structured interviews were analyzed within and across participants to understand their perceptions of the WEQQ’s usefulness and acceptability. We also collected quantitative data from a short questionnaire on creating exam questions, as well as the psychometric properties of the questions created.

Results and conclusion: Participants perceived the WEQQ positively and were favourable to its use. The WEQQ appears to offer a user-friendly, straightforward way to support faculty members in creating multiple-choice or short-answer questions. Time on task remained the same when using the WEQQ. We identified two user profiles, passive and active, which described how faculty members used the WEQQ to create exam questions. Future steps are to investigate whether the WEQQ can increase the quality of written-exam questions and to understand how to promote active use of the WEQQ when implementing the tool.

Published

2024-09-24

How to Cite

Vachon Lachiver É, St-Onge C. User experience of the Written Exam Question Quality tool to inform the writing of new written-exam questions. Can Med Educ J [Internet]. 2024 Sep 24 [cited 2024 Nov 24]. Available from: https://journalhosting.ucalgary.ca/index.php/cmej/article/view/72320

Section

Brief Reports