Does Complex Analysis (IRT) Pay Any Dividends in Achievement Testing?

Authors

  • John O. Anderson

DOI:

https://doi.org/10.11575/ajer.v45i4.54708

Abstract

The study was an exploratory investigation of the consequences of using a complex test-and-item analysis approach in a large-scale testing situation that historically has used a conventional approach of simple number-right scoring. In contemplating modifications to a complex, high-stakes testing program with a long history of successful operation, any change in operations would have to be carefully evaluated to ensure a high probability of improvement. So if a change from number-right scoring to item response theory (IRT) scoring is under consideration, the question arises: Does the increase in complexity and difficulty associated with the use of IRT pay significant dividends in better achievement estimates? In terms of consequences, it did not make much difference which domain score estimate was selected for use: any estimate gives approximately the same results in terms of mean, standard deviation, error of estimation, and correlation with other sources of estimation of student achievement.
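
The following is a minimal sketch, not the study's actual analysis, of the kind of comparison the abstract describes: score the same simulated responses by number-right and by an IRT ability estimate, then compare means, standard deviations, and the correlation between the two. The Rasch (1PL) model, the item difficulties, and the sample size are all assumptions made for illustration.

    # Minimal sketch (assumptions: Rasch model, simulated data), not the study's analysis.
    import numpy as np

    rng = np.random.default_rng(0)
    n_students, n_items = 2000, 40
    theta_true = rng.normal(0.0, 1.0, n_students)   # latent abilities
    b = rng.normal(0.0, 1.0, n_items)               # item difficulties

    # Simulate dichotomous item responses under the Rasch model
    p = 1.0 / (1.0 + np.exp(-(theta_true[:, None] - b[None, :])))
    x = (rng.random((n_students, n_items)) < p).astype(int)

    # Conventional number-right score
    number_right = x.sum(axis=1)

    # Rasch maximum-likelihood ability estimate via Newton-Raphson,
    # with extreme (zero/perfect) raw scores adjusted to keep estimates finite
    def rasch_mle(responses, difficulties, iters=25):
        score = np.clip(responses.sum(axis=1).astype(float),
                        0.5, len(difficulties) - 0.5)
        theta = np.zeros(len(responses))
        for _ in range(iters):
            prob = 1.0 / (1.0 + np.exp(-(theta[:, None] - difficulties[None, :])))
            grad = score - prob.sum(axis=1)          # d logL / d theta
            info = (prob * (1.0 - prob)).sum(axis=1) # Fisher information
            theta += grad / info
        return theta

    theta_hat = rasch_mle(x, b)

    # Compare the two domain-score estimates, as the abstract describes
    print("number-right:  mean %.2f  sd %.2f" % (number_right.mean(), number_right.std()))
    print("IRT theta-hat: mean %.2f  sd %.2f" % (theta_hat.mean(), theta_hat.std()))
    print("correlation between estimates: %.3f" % np.corrcoef(number_right, theta_hat)[0, 1])

Under the Rasch model the number-right score is a sufficient statistic for ability, so the two estimates are monotonically related by construction; the sketch only illustrates how such a comparison could be set up, not the study's data or results.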

Published

1999-12-01

How to Cite

Anderson, J. O. (1999). Does Complex Analysis (IRT) Pay Any Dividends in Achievement Testing?. Alberta Journal of Educational Research, 45(4). https://doi.org/10.11575/ajer.v45i4.54708