LMS assessment: using IRT analysis to detect defective multiple-choice test items

Panagiotis Fotaris, Theodoros Mastoras

Research output: Contribution to journal › Article › peer-review

Abstract

Due to the computerisation of assessment tests, the use of Item Response Theory (IRT) has become commonplace for the development, evaluation and refinement of educational assessments. When used appropriately by a Learning Management System (LMS), IRT can improve assessment quality, increase the efficiency of the testing process and provide in-depth descriptions of item properties. This paper introduces a methodological and architectural framework which embeds an IRT analysis tool in an LMS so as to extend its functionality with assessment optimisation support. By applying a set of validity rules to the statistical indices produced by the IRT analysis, the enhanced LMS is able to detect defective items in an item pool, which are then reported for content review. Assessment refinement is achieved by repeating this process until all flawed items are eliminated.
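The rule-based screening the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's implementation: the 3PL parameter names (a = discrimination, b = difficulty, c = pseudo-guessing) are standard IRT notation, but the threshold values and the `flag_defective_items` helper are illustrative assumptions.

```python
# Hedged sketch of validity rules applied to IRT (3PL) item indices.
# Thresholds below are illustrative assumptions, not values from the paper.

def flag_defective_items(items):
    """Return (item id, reasons) pairs for items violating simple validity rules.

    Each item is a dict with 3PL estimates:
      a = discrimination, b = difficulty, c = pseudo-guessing.
    """
    flagged = []
    for item in items:
        reasons = []
        if item["a"] < 0.5:                # weak discrimination between ability levels
            reasons.append("low discrimination")
        if not -3.0 <= item["b"] <= 3.0:   # difficulty outside the usual logit range
            reasons.append("extreme difficulty")
        if item["c"] > 0.35:               # guessing well above chance for 4 options
            reasons.append("high guessing")
        if reasons:
            flagged.append((item["id"], reasons))
    return flagged

# Toy item pool with estimated 3PL parameters.
pool = [
    {"id": 1, "a": 1.2, "b": 0.3, "c": 0.20},
    {"id": 2, "a": 0.2, "b": 0.1, "c": 0.25},  # poor discriminator
    {"id": 3, "a": 0.9, "b": 4.1, "c": 0.10},  # far too difficult
]
print(flag_defective_items(pool))
```

Items 2 and 3 would be reported for content review; re-running the analysis after revising them mirrors the iterative refinement loop the abstract describes.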
Original language: English
Pages (from-to): 281-296
Number of pages: 16
Journal: International Journal of Technology Enhanced Learning
Volume: 6
Issue number: 4
DOIs
Publication status: Published - 1 Dec 2014
