Open Access
Using a Delphi Approach to Develop Rubric Criteria
Author(s) -
Gayle Lesmond,
Nikita Dawe,
Lisa Romkey,
Susan McCahan
Publication year - 2016
Language(s) - English
Resource type - Conference proceedings
DOI - 10.18260/p.27121
Subject(s) - rubric, Delphi method, teamwork, engineering
Recent developments in post-secondary institutions have motivated a shift toward outcomes-based education. A major impetus for this agenda has been the growing need to provide concrete evidence of student learning and institutional effectiveness to various stakeholders. Given this trend, it is important that research be undertaken to explore valid approaches to learning outcomes assessment. The research described here involves the development of valid, non-discipline-specific, analytic rubrics that assess learning outcomes in five key areas: communication, design, teamwork, problem analysis, and investigation. This paper reports on the methodology used to complete the first stage of rubric development: identifying the standards by which student work is evaluated. In particular, a two-round Delphi study was designed to identify rubric criteria for assessing problem analysis and investigation. The Delphi technique is an iterative research tool used to elicit input from a panel of experts. It typically involves a series of virtual survey rounds in which experts offer their views anonymously and have the opportunity to refine them based on controlled feedback from earlier rounds. The panels comprised 11 experts for investigation and 15 experts for problem analysis, drawn from faculty and staff. In the first round, participants were asked to propose learning outcome statements, or "indicators," that are important for assessing problem analysis or investigation. In the second and final round, these responses were organized by major outcome area and returned to participants for feedback: they were asked to rate how likely they would be to use each indicator and how important it was in the curriculum. The focus of this paper is not the results of the study but the methodological processes involved in designing and administering a Delphi survey to develop tools for learning outcomes assessment, including expert selection, survey design, and analysis of expert responses. Special attention is paid to the challenges of conducting a Delphi study.
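
The abstract describes the analysis of expert responses only at a high level. As a minimal sketch of how second-round Delphi ratings are often summarized, the Python snippet below applies a median/IQR consensus rule; the ratings data, threshold values, and the retain/revisit rule are illustrative assumptions, not taken from the study.

    # Hypothetical sketch of second-round Delphi aggregation. Median and
    # interquartile range (IQR) are common consensus measures in Delphi
    # studies; the paper does not specify its analysis, so the ratings,
    # thresholds, and retain/revisit rule below are assumptions.
    import statistics

    # Panel ratings per indicator on an assumed 5-point scale
    # (e.g., likelihood of using the indicator to assess student work).
    ratings = {
        "Identifies relevant variables": [5, 4, 5, 4, 4, 5, 3, 4, 5, 4, 4],
        "Justifies choice of method": [3, 4, 2, 4, 3, 5, 3, 2, 4, 3, 4],
    }

    IQR_CONSENSUS = 1.0  # assumed cutoff: IQR <= 1 counts as consensus
    MEDIAN_KEEP = 4.0    # assumed cutoff: median >= 4 counts as endorsed

    for indicator, scores in ratings.items():
        median = statistics.median(scores)
        q1, _, q3 = statistics.quantiles(scores, n=4)  # quartile cut points
        iqr = q3 - q1
        status = "retain" if (median >= MEDIAN_KEEP and iqr <= IQR_CONSENSUS) else "revisit"
        print(f"{indicator}: median={median}, IQR={iqr:.2f} -> {status}")

In a real study, indicators flagged "revisit" would typically feed back into panel discussion or a further round rather than being dropped automatically.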
