Automated Test‐Form Generation
Author(s) - Wim J. van der Linden, Qi Diao
Publication year - 2011
Publication title - Journal of Educational Measurement
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.917
H-Index - 47
eISSN - 1745-3984
pISSN - 0022-0655
DOI - 10.1111/j.1745-3984.2011.00140.x
Subject(s) - item bank , integer programming , item response theory , psychometrics , algorithm , computer science , statistics , mathematics
In automated test assembly (ATA), the methodology of mixed‐integer programming is used to select test items from an item bank to meet the specifications for a desired test form and optimize its measurement accuracy. The same methodology can be used to automate the formatting of the set of selected items into the actual test form. Three different cases are discussed: (i) computerized test forms in which the items are presented on a screen one at a time and only their optimal order has to be determined; (ii) paper forms in which the items need to be ordered and paginated and the typical goal is to minimize paper use; and (iii) published test forms with the same requirements but a more sophisticated layout (e.g., double‐column print). For each case, a menu of possible test‐form specifications is identified, and it is shown how they can be modeled as linear constraints using 0–1 decision variables. The methodology is demonstrated using two empirical examples.
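As a rough illustration of the 0–1 mixed-integer programming approach the abstract describes, the sketch below selects items from a small bank so that total information at a target ability level is maximized, subject to a test-length constraint and content constraints. The item bank, information values, content categories, and bounds are hypothetical, and the model is solved with the open-source PuLP library rather than the authors' software; it is only a minimal instance of the kind of model the paper works with.

import pulp

# Hypothetical bank: item information at the target ability level and
# a content category for each item (illustrative values only).
info = [0.42, 0.55, 0.31, 0.48, 0.60, 0.29, 0.51, 0.44]
category = ["algebra", "algebra", "geometry", "geometry",
            "algebra", "geometry", "algebra", "geometry"]
n_items = len(info)
test_length = 4                                   # required number of items
min_per_category = {"algebra": 2, "geometry": 2}  # content specifications

# 0-1 decision variable x[i] = 1 if item i is selected for the form.
x = [pulp.LpVariable(f"x_{i}", cat=pulp.LpBinary) for i in range(n_items)]

model = pulp.LpProblem("automated_test_assembly", pulp.LpMaximize)

# Objective: maximize the total information of the selected items.
model += pulp.lpSum(info[i] * x[i] for i in range(n_items))

# Test-length constraint.
model += pulp.lpSum(x) == test_length

# Content constraints: at least the required number of items per category.
for cat_name, minimum in min_per_category.items():
    model += pulp.lpSum(x[i] for i in range(n_items)
                        if category[i] == cat_name) >= minimum

model.solve(pulp.PULP_CBC_CMD(msg=False))
selected = [i for i in range(n_items) if pulp.value(x[i]) > 0.5]
print("Selected items:", selected)

The formatting problems in cases (i)-(iii) rely on the same machinery: 0–1 variables indexed by item and position (or page) and linear constraints on which item may occupy which slot, as stated in the abstract.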
