Do Teachers Consider Advice? On the Acceptance of Computerized Expert Models
Author(s) - Esther Kaufmann, David V. Budescu
Publication year - 2019
Publication title - Journal of Educational Measurement
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.917
H-Index - 47
eISSN - 1745-3984
pISSN - 0022-0655
DOI - 10.1111/jedm.12251
The literature suggests that simple expert (mathematical) models can improve the quality of decisions, but people are not always willing to accept and endorse such models. We ran three online experiments to test receptiveness to advice from computerized expert models. Middle‐ and high‐school teachers (N = 435) evaluated student profiles that varied in several personal and task‐relevant factors. The teachers were offered (Studies I and II), or could ask for (Study III), advice from either expert models or human advisors. Overall, teachers requested and followed the advice of expert models less frequently than advice from humans. Task‐relevant factors (e.g., task difficulty) appear to be more salient than personal factors in shaping teachers' willingness to receive advice.